Channel: Teradata Forums - All forums

How to load EBCDIC file to db using TPT? - response (2) by feinholz


On which version of z/OS are you running?
(We only support loading EBCDIC data on the mainframes.)
 


Precision loss during expression evaluation - data type issue? - forum topic by tstrick4


Hi, when I execute the following SQL

 sel doc_id
	, prd_id
	, prd_cmn_name
	, net_quant
	, mvt_gravity
	, cast(net_quant*42*(mvt_gravity*8.338426855)-0.0101578 as DECIMAL(19,13))
from monroe_prd.movements 

I am getting the following error:
SELECT Failed 2614: Precision loss during expression evaluation
Relevant table DDL is below

CREATE SET TABLE PRD.MOVEMENTS
     (
      DOC_ID DECIMAL(12,0) TITLE 'SRA Document ID',
      PRD_ID CHAR(4) CHARACTER SET LATIN NOT CASESPECIFIC TITLE 'Product Code',
      PRD_CMN_NAME VARCHAR(30) CHARACTER SET LATIN NOT CASESPECIFIC TITLE 'Product Name',
      NET_QUANT DECIMAL(9,2) TITLE 'Net Quantity',
      GROSS_QUANT DECIMAL(9,2) TITLE 'Gross Quantity',
      MVT_GRAVITY DECIMAL(7,6) TITLE 'Gravity Spec',
      MVT_API DECIMAL(10,7) TITLE 'Movement API',
      CREATEDATETIME TIMESTAMP(0) TITLE 'CreateDate',
      UPDATEDATETIME TIMESTAMP(0) TITLE 'UpdateDate')
UNIQUE PRIMARY INDEX ( DOC_ID ,DOC_NUM )
INDEX ( PRD_ID ,MVT_DATE );

After reviewing the Messages manual, I have tried casting the result as FLOAT and as many other combinations of DECIMAL(m,n), and I get the same error. I did the equation in a graphing calculator for a specific row and came up with a DECIMAL(19,13) answer, which is why I've been trying to cast as that data type. Ultimately I'd like to round the answer to 4 decimal places, but I have not yet gotten to that point.
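For what it's worth, one way to see which intermediate result type is the culprit is Teradata's TYPE function. Error 2614 fires when the DECIMAL type derived for an expression cannot hold the computed value without dropping digits, so a diagnostic sketch against the same table can narrow it down:

-- Sketch: inspect the result types Teradata derives for each sub-expression.
SELECT TYPE(net_quant * 42)                               AS t_quant
     , TYPE(mvt_gravity * 8.338426855)                    AS t_gravity
     , TYPE(net_quant * 42 * (mvt_gravity * 8.338426855)) AS t_product
FROM monroe_prd.movements
SAMPLE 1;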
Thanks in advance for any ideas that you have


Teradata Training Material available - response (90) by snanda3


Hi Todd,

I am new to Teradata and am learning from online blogs.
Could you please share the Teradata material with me at the following mail id:
sunil.nanda.east@gmail.com

Thank you
Sunil Nanda

Precision loss during expression evaluation - data type issue? - response (1) by VandeBergB


The 2614 error means the expression's result doesn't fit the precision of the result type Teradata derives for it. I plugged some literals into your cast stmt and got the same errors. When I force the order of operations and limit the values to four decimal places (you said you wanted to round to four places), it works.
Not sure if you want this evaluated left to right, but ...

SELECT 123456.33*42*(1.888333*8.338426855)-0.0101578

generates a 2614 error; casting that as FLOAT generates the error as well.

SELECT CAST(123456.3333*((42.0000*(1.8883*8.998))- 0.01015) AS FLOAT)

Extending the integer 42 to 42.0000 and limiting the other values to four decimal places returns a result:

SELECT CAST(123456.3333*((42.0000*(1.8883*8.338))- 0.0101578) AS FLOAT)

Interesting conundrum...

SELECT CAST((123456.3333*42*1.8883*8.3384269) - .0101578 AS FLOAT)

The last query works as well.
What does your data profile look like for the columns involved in the calculation?
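If forcing the order of operations doesn't hold up against the real columns, another option (a sketch only, not tested against this table) is to do the arithmetic entirely in FLOAT, so no intermediate DECIMAL type can overflow, and cast back to DECIMAL only at the end, which also gives the rounding to four places:

SELECT doc_id
     , prd_id
     , net_quant
     , mvt_gravity
     -- arithmetic in FLOAT; the final cast to DECIMAL(18,4) rounds to 4 places
     , CAST(CAST(net_quant AS FLOAT) * 42 * (CAST(mvt_gravity AS FLOAT) * 8.338426855)
            - 0.0101578 AS DECIMAL(18,4)) AS calc_result
FROM monroe_prd.movements;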

TASM Ruleset Export/Import - forum topic by mikesteeves


I'm trying to export our current TASM Ruleset from our production system and import it onto our staging system.  I keep receiving the following error.

 

Cannot import ruleset: The import file could not be parsed

 

Anyone know why?

Thanks,

Mike


TASM Ruleset Export/Import - response (1) by mikesteeves


In Viewpoint, which monitors both our prod and stage systems, I can right-click to export the ruleset in prod and save it to my machine. However, when I attempt to import that saved file, I receive the error:
Cannot import ruleset: The import file could not be parsed

SAS Clinical Online Training India - forum topic by goodtraining111


We are providing exclusive online training. Faculty from top MNCs with highly skilled domain expertise will train and guide you with real-time applications. We also help you with resume preparation. For more information, visit my site.
 
www.goodonlinetraining.com
 +91 996 395 7366     
Email : knaveen.sas@gmail.com
Email : info@goodonlinetraining.com
 
Good Online Training is the best online institute providing SAS online training by certified professionals. Get certified with our well-experienced certified trainers. We offer good-quality online training courses, along with tips to improve trainees' knowledge of software training courses and computer IT training online, so they can compete in today's competitive software world. We train students in different modes of instruction to make them the best in the online software training field and keep them current with the latest technology. We have a highly practiced and proficient faculty who polish trainees in a contemporary manner, amplifying their style of learning and their grasp of current issues, which helps them grow in this field.
 
goodonlinetraining.com SAS online training will be given by a SAS Base and Advanced certified professional. Get SAS certified with our SAS online training course. SAS Training Online is based in Hyderabad, India. Email knaveen.sas@gmail.com or call us at +919177856619 to attend free demos. SAS Online Training has taken the utmost care in recruiting for online training.
 
SAS (Statistical Analysis System) is one of the most widely used statistical software packages in both industry and academic circles. SAS software was developed in the late 1960s at North Carolina State University, and the SAS Institute was formed in 1976. SAS is a powerful statistical package that can run on many platforms, including Windows and Unix. The SAS system is a collection of products available from the SAS Institute. SAS software is a combination of a statistical package, a database management system, and a high-level programming language. By providing a way to access, manage, and analyze data more efficiently and effectively, SAS solutions enable life sciences companies to manage risk and maximize profits, develop effective drug treatments, and apply the power of business analytics to make commercialization programs even more successful. For more information, visit: http://www.goodonlinetraining.com/sas-online-training
 
 
 
 
 


Set up of standard archive & purge mainframe jobs in Teradata? - forum topic by Nishant.Bhardwaj


Hi all,
Can anyone please share a doc or case study, or explain the process of setting up standard archive & purge mainframe jobs in Teradata?
Thanks!
Cheers!
Nishant


How to determine whether TotalIOCount is significant or not - forum topic by nyemul


Hello,
 
After running a set of queries, we found various values of TotalIOCount.
These values range from a few hundred to as high as 23 * 10^6.
On a system with 14 nodes and 504 AMPs, how do we say whether these values are significant or not?
 
Niteen
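There is no absolute threshold: 23 * 10^6 I/Os spread over 504 AMPs is roughly 45,000 per AMP, and whether that is significant depends on how it compares with the rest of the workload. Assuming DBQL logging is enabled, a sketch that profiles TotalIOCount across yesterday's queries so each value can be judged against its peers:

-- Sketch (assumes DBQL logging is on): per-user I/O distribution, so an
-- individual query's TotalIOCount can be compared with the workload norm.
SELECT UserName
     , COUNT(*)          AS query_cnt
     , AVG(TotalIOCount) AS avg_io
     , MAX(TotalIOCount) AS max_io
FROM dbc.DBQLogTbl
WHERE CAST(StartTime AS DATE) = CURRENT_DATE - 1
GROUP BY UserName
ORDER BY max_io DESC;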


Teradata Online Training India - forum topic by goodtraining111


Teradata ONLINE TRAINING
Good Online Training offers the best Teradata online training, with industry experts as trainers. All the experienced certified tutors will share their experience, tips, and tricks in the Teradata online training course. For more information, please contact us:
www.goodonlinetraining.com
 +91 996 395 7366     
Email : knaveen.sas@gmail.com
Email : info@goodonlinetraining.com
Teradata online training course content:
Introduction:

  •  What is Teradata?
  • Teradata V12 and other versions
  • Overview of Teradata
  • Teradata warehouse
  • Teradata in the Enterprise
  • Teradata Users & Scalability
  • Interview Q&A and certification queries

Teradata Architecture and Components

  • Teradata Architecture
  • SMP Architecture
  • MPP Architecture
  • AMP (Access Module Processor)
  • TDP (Teradata Director Program)
  • CLI (Call-Level Interface)
  • TPA (Trusted Parallel Application)
  • Parallelism Architecture
  • Benefits and Types

Data Recovery and Security

  • Why locks are needed & how to release them
  • Various locks for simultaneous access
  • RAID 1 & RAID 5
  • Disk Arrays
  • Fall Back
  • Clique
  • AMP Clustering
  • Types of Journals
  • Recovery Journal
  • Transient Journal
  • Permanent Journal
  • Before Journal
  • After Journal
  • How & where journals are used
  • Answering various recovery questions

Teradata Indexes 

  • Primary Index
  • Unique
  • Non unique
  • Partitioned
  • Secondary Index
  • Unique
  • NonUnique
  • Hash, Join, Value Ordered
  • Skewness
  • Secondary Index Sub table
  • Accessing Records via Primary Index
  • Accessing records via Secondary Index
  • Keys Vs Indexes
  • Full Real time scenarios and explaining

Teradata SQL Reference

  • SQL Fundamentals
  • Data Types and Literals
  • Data Definition Statements(DDL)
  • Data Manipulation Statements(DML)
  • Explaining with proper examples

Teradata Functions and Operators

  • String Functions
  • Format Function
  • Cast Functions
  • Group & Aggregate Functions
  • WITH & WITH...BY clauses
  • Practices of this section

Teradata Transactions
Implicit Transaction
Explicit Transaction

  • Performance Tuning and Explain Utility
  • Explain Usage
  • Collecting Statistics
  • Tuning SQL Performance
  • Usage of PMON
  • Explaining various SQL statements
  • Joins and Unions
  • Inner Join
  • Left Outer Join
  • Right Outer Join
  • Full Outer Join
  • Detailed explanation
  • Join Strategies
  • Product Join
  • Merge Join
  • Hash Join
  • Nested Join
  • Questions of this section

Teradata Basic Commands

  • HELP
  • SHOW
  • EXPLAIN
  • COLLECT STATISTICS

Teradata Objects

  • Tables
  • SET table
  • MULTISET table
  • Volatile tables
  • Global Temporary tables
  • Derived tables
  • Views
  • Macros
  • Stored Procedures
  • Triggers
  • Practices and FAQs of this session

Teradata spaces

  • PERM space
  • SPOOL space
  • TEMPORARY space
  • Teradata User and managing
  • Practical Examples

Teradata Transaction Modes

  • BTET
  • ANSI
  • Interactive
  • Batch

Teradata SQL Assistant (Queryman)

  • Teradata Performance Monitor
  • Teradata BTEQ
  • Batch Scripts with samples
  • Branching and Looping
  • Importing data through scripts
  • Exporting data through scripts
  • Error handling
  • Explanation with proper debugging
  • Teradata Fast Load
  • Various Phases of Fast Load
  • Advantages and Process
  • Limitations of Fast Load
  • Sample Scripts

Real time Issues and resolving it

  • Teradata Multi Load
  • Various Phases of Multi Load
  • Limitations of Multi Load
  • Working with multiple tables
  • Applying various operations
  • Sample Scripts

Real time Issues and solving it

  • Teradata Parallel Data Pump
  • Limitations of TPump
  • Overview of TPump
  • Sample Scripts
  • Teradata Fast Export
  • Exporting Data
  • Passing Parameters
  • OUTMODS
  • Sample Scripts
  • Utility Vs Parallelism

For more information, visit my site: http://www.goodonlinetraining.com/data-warehousing/teradata-online-training
 


#Records estimation for derived tables. "Too many rows" error - forum topic by mjasrotia


Hi all,
 
We are facing a very frustrating problem in our reporting warehouse. We have given an ad hoc BO universe to the business users; users make their choices and execute the reports. Some of the metrics are derived using derived tables. With a static query we can always have stats collected on the joined tables, the optimizer has high-confidence record estimates, and we can still control things somewhat by analysing the filters used. The problem we are now facing is that, since we cannot collect stats on the derived tables, the optimizer estimates record counts in the millions, which in turn fails the DBA-set criterion on the maximum records allowed; hence we are getting the "Too Many Rows" error and the queries are not executing.
 
Has anybody else faced the same issue? If yes, how do we go about resolving this? Is it really advisable to put a limit on the number of rows rather than on the time taken?
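One workaround often discussed for this, shown as a sketch below with hypothetical table and column names (and it presumes the SQL can be pre-built or intercepted, which a purely ad hoc BO universe may not allow), is to materialise the derived table as a volatile table so stats can be collected before the join:

-- Sketch, hypothetical names: materialise the derived table, collect stats
-- on the join column, then join to vt_metric instead of the derived table.
CREATE VOLATILE TABLE vt_metric AS
( SELECT acct_id
       , SUM(sale_amt) AS tot_amt
  FROM sales
  GROUP BY acct_id
) WITH DATA
ON COMMIT PRESERVE ROWS;

COLLECT STATISTICS ON vt_metric COLUMN (acct_id);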
 
Please suggest
Thanks
Manik


REPLACE VIEW Failed. 6922: Illegal or unsupported use of subquery/derived table inside a recursive query/view. - forum topic by bikky6


REPLACE VIEW Failed. 6922:  Illegal or unsupported use of subquery/derived table inside a recursive query/view.

My query is something like the below:
REPLACE RECURSIVE VIEW <DB>.<VIEWNAME> (COL1, COL2, COL3,ROW_NUM,DEPTH) AS
(
SELECT 
COLUMNS..
FROM  X
LEFT OUTER JOIN   K2
ON X.COL1=Y.COL1
AND COL3='ANYWHERELAST'
WHERE COL2 IS NULL
QUALIFY ROW_NUM=1
 
UNION ALL
SELECT 
COLUMNS..
FROM <VIEWNAME>
LEFT OUTER JOIN  K2
ON CONDITIONS
AND  ....
WHERE K2.COLUMN IS NOT NULL
AND NEWDEPTH <= (SELECT COUNT(*) FROM Z WHERE  COL2='SOME CONDITION')
);
 
The problem is with the subquery in the second part of the UNION ALL. How do I resolve this part?
 
Thanks in advance
KVB
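Teradata does not allow subqueries or derived tables anywhere inside a recursive view, which is exactly what 6922 flags. One possible workaround, sketched below with the same placeholder names as the query above: move the scalar subquery into its own non-recursive one-row view, join it once in the seed, and carry the value through the recursion as an extra column.

-- Sketch: the COUNT(*) moves into an ordinary one-row view...
REPLACE VIEW <DB>.Z_LIMIT AS
SELECT COUNT(*) AS MAX_DEPTH FROM Z WHERE COL2 = 'SOME CONDITION';

-- ...which the recursive view can reference like a base table.
REPLACE RECURSIVE VIEW <DB>.<VIEWNAME> (COL1, COL2, COL3, ROW_NUM, DEPTH, MAX_DEPTH) AS
(
SELECT COLUMNS.., L.MAX_DEPTH
FROM X
CROSS JOIN <DB>.Z_LIMIT L
LEFT OUTER JOIN K2
ON X.COL1 = Y.COL1
AND COL3 = 'ANYWHERELAST'
WHERE COL2 IS NULL
QUALIFY ROW_NUM = 1

UNION ALL
SELECT COLUMNS.., V.MAX_DEPTH
FROM <VIEWNAME> V
LEFT OUTER JOIN K2
ON CONDITIONS
WHERE K2.COLUMN IS NOT NULL
AND NEWDEPTH <= V.MAX_DEPTH
);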


Data labs - Default permissions - forum topic by doneill@westpac.com.au


Our production Teradata environment consists of several business silos across the organisation, each silo comprising its own set of databases and permissions. We're setting up data labs in order to allow users access across these silos. These data lab users will have read access across multiple silos.
A user created a view of a production table in a data lab. On executing "sel * from <viewname>", they received "An owner referenced by user does not have SELECT WITH GRANT OPTION".
There are at least two possible solutions available:
1. I "grant sel on <prod DB> to <Datalab DB> with grant option", or
2. I grant it on "ALL <Datalab Parent DB>" (under which all our data labs reside).
So my question is: is there a best-practice method for this scenario when using data labs?
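For context, the error usually means the database that owns the view (the data lab) does not itself hold a grantable SELECT on the production object, so it cannot pass access through the view to the querying user. Option 1 as a concrete statement would look like this (a sketch with hypothetical database names):

-- Give the view-owning data lab database a SELECT it can pass on to users.
GRANT SELECT ON prod_db TO datalab_db WITH GRANT OPTION;

Option 2 is the same grant issued at the data lab parent level: broader, but it avoids repeating the grant for every individual lab.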


How to load EBCDIC file to db using TPT? - response (3) by Marcus1


I'm not using a mainframe.

I need to load a given EBCDIC file into Teradata on a network attached client. The EBCDIC file does not contain any packed columns, but does have columns with signed values.
I'm using TPT V13.10 on my TD notebook (or on TD VM Linux) and my test db is a remote TD 14.00 VM.
If it's not possible with TPT, how about MultiLoad or FastLoad?
 

What does UTY8709 mean in FastExport? - forum topic by robertskin


Hi, guys,
 
I am using FastExport to export my db results, and in the logs some lines show:
**** 18:42:33 UTY8722 234664 total records written to output file.
which means the records were written to the output file. But there are also lines that show:
 

**** 18:38:07 UTY8709 Restart in progress, processing for this FastExport task

     has already been completed.

 

which seems like something went wrong with the export, and I have checked that there are no records there.

 

Could anyone tell me what the message UTY8709 means? Any clue is appreciated.
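For what it's worth, UTY8709 is FastExport's restart logic talking: the restart log table named by the script's .LOGTABLE statement still contains a checkpoint saying this task already completed, so the utility skips the export and writes nothing. If the earlier run genuinely finished and a fresh export is wanted, dropping (or renaming) the restart log table clears that state. A sketch with a hypothetical table name:

-- Hypothetical name; use whatever your .LOGTABLE clause actually points at.
DROP TABLE utillog.fexp_restart_log;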


Perfect Query Timings needs to be captured - forum topic by Jessy Mahesh Kothapalli


 
Hello, we are using Teradata 13.10 and also Viewpoint. We use the OBIEE 11g tool to refresh the reports.
On a daily basis we run 10000+ reports against Teradata using the PROD_OBIEE_USER user (only one user; DBQL logging is enabled).
As per the client's requirement, how do we analyse how many queries completed within 10 mins? Between 10 and 20 mins? Between 20 and 30 mins?
I was trying to take metrics from DBQL, but it did not show correct results.
Please help me out of this situation; please provide a valid query for checking the above timings.
 
 


TD utilities and JAVA. - response (5) by mayanktiwari


Thanks Ulrich and Tom.
 
Is there any possibility to execute a program using BTEQ or any other utility in a UNIX environment?
I want to run a Java program which will run TD queries and fetch the results from TD itself.
I hope this makes my requirement clearer.
 
Thanks in advance for further responses.

Data Warehouse Testing - response (7) by ruchir85


Hi,
I have been involved in manual ETL testing for the past 3 years, using SQL, data extraction, and data comparison.
Could someone please let me know how Teradata tools are used in ETL testing?

Teradata Training Material available - response (91) by prashanthmspk


Hi Todd,
It is very kind of you, and I appreciate your helping nature.
Please mail me the materials, as I am planning to take the developer certification. Mail: prashanthmspk@gmail.com
 
Regards,
Prashanth

Perfect Query Timings needs to be captured - response (1) by VandeBergB


The StartTime column in the dbc.DBQLogTbl table is when the query started. FirstRespTime is when the query completed and the first row of the result set was returned to the user/application.
Your query, depending upon what other fields you need from DBQLogTbl, will look something like this...

SELECT (FirstRespTime - StartTime) DAY(4) TO SECOND AS ELAPSED FROM dbc.DBQLogTbl;
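Building on that, a sketch (assuming default DBQL logging, and filtering to the OBIEE user from the question) that buckets queries into the requested elapsed-time bands:

-- Sketch: count queries per elapsed-time band for the OBIEE user.
SELECT CASE
       WHEN (FirstRespTime - StartTime) DAY(4) TO SECOND < INTERVAL '0 00:10:00' DAY TO SECOND
         THEN '1: under 10 min'
       WHEN (FirstRespTime - StartTime) DAY(4) TO SECOND < INTERVAL '0 00:20:00' DAY TO SECOND
         THEN '2: 10 to 20 min'
       WHEN (FirstRespTime - StartTime) DAY(4) TO SECOND < INTERVAL '0 00:30:00' DAY TO SECOND
         THEN '3: 20 to 30 min'
       ELSE '4: over 30 min'
       END AS elapsed_band
     , COUNT(*) AS query_cnt
FROM dbc.DBQLogTbl
WHERE UserName = 'PROD_OBIEE_USER'
GROUP BY 1
ORDER BY 1;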

 
