Channel: Teradata Forums - All forums

Table Partition - response (1) by Raja_KT


all-rows scan with a condition of ("(INFORM_USER.TABLEA.MO_ID = 201202) OR (INFORM_USER.TABLEA.MO_ID = 201201)")
Cheers,
Raja


extending MLPPI tables - response (1) by Raja_KT


AFAIK, it is not possible since the table is already populated. However, you can try the approach below.
Note: to be 100% sure, take a backup of the entire table first, because I don't know which TD version you are working on.
 

CREATE TABLE old_table1   /* create backup only for the partitions you want */
( key1 INTEGER NOT NULL,
  .... )
UNIQUE PRIMARY INDEX (key1);

ALTER TABLE table1        /* your table */
MODIFY PRIMARY INDEX
DROP RANGE#L2 BETWEEN DATE 'xxxxx' AND DATE 'yyyyy'
ADD RANGE#L2 BETWEEN 201401 AND 201412 EACH 1 ....
WITH INSERT INTO old_table1;

 

Cheers,

Raja

extending MLPPI tables - response (2) by M.Saeed Khurram


Hi,
You can use the ALTER TABLE statement to alter partitions on an empty table in any way, but there are certain limitations when the table is populated.
When the table is populated, you can add or drop ranges only at the ends of the partitioning range. Can you please specify the error you are getting?
Further, as per your ALTER TABLE statement:
ALTER TABLE COD.SOME_TABLE_S_C12126
MODIFY PRIMARY INDEX (mstr_cust_ky)
ADD RANGE#L2 BETWEEN 201401 AND 201412 EACH 1;
this range has already been specified in the CREATE TABLE statement.
Please paste the error you are getting so that we can resolve it.
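Before re-running the ALTER, it may help to confirm which ranges are already defined; a minimal check against the table named above:

SHOW TABLE COD.SOME_TABLE_S_C12126;   /* displays the full DDL, including the current RANGE_N partitioning expression */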
 

TPT Template operators with LDAP authentication - forum topic by tdforumuser


I have coded a simple TPT script using operator templates ($EXPORT, $INSERT and $LOAD) to copy the data of Table A from TDSERVER-A to TDSERVER-B. TDSERVER-A and TDSERVER-B use LDAP-based authentication. TPT (TBUILD) raises errors when I run the script, whether I initialize the LogonMech, UserName, and UserPassword variables for the Source/Target operators in the JobVars file or override those variables inline in the TPT script using the ATTR option. The error messages I noticed are given below. The strange thing is that I was able to run the same script in a different TD environment that does not use LDAP-based authentication. Has anyone experienced a similar issue? Are there any tweaks that should be made to the TPT script to use operator templates successfully with LDAP authentication? Any input is appreciated.
Error Msgs:
Teradata Parallel Transporter Version 14.00.00.08
TPT_INFRA: TPT05014: RDBMS error 8017: The UserId, Password or Account is invalid.
TPT_INFRA: TPT04032: Error: Schema generation failed for table 'DBNAME.TABLENAME' in DBS 'T7DEV':
  "GetTableSchema" status: 48.
Job script preprocessing failed.
Job terminated with status 12.
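For comparison, this is roughly how the LDAP settings would look in a JobVars file; the variable names follow the usual Source*/Target* template convention but are assumptions here (they may differ by TPT version), and the host/user values are placeholders:

/* Hypothetical JobVars file -- names and values are illustrative only */
 SourceTdpId        = 'TDSERVER-A'
,SourceLogonMech    = 'LDAP'
,SourceUserName     = 'source_user'
,SourceUserPassword = 'source_password'
,TargetTdpId        = 'TDSERVER-B'
,TargetLogonMech    = 'LDAP'
,TargetUserName     = 'target_user'
,TargetUserPassword = 'target_password'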
 


Table Partition - response (2) by M.Saeed Khurram


Hi,

There are a couple of points to notice about your query:
The first is that you have a PI defined on (PLCY_NBR, MPD_ID), but you are using only a partial value in your WHERE clause (MOD_ID alone). Whenever only part of the index is searched, the optimizer will not be able to use the PI.
The second is that you are using an IN condition with two values, which the optimizer rewrites as ("(INFORM_USER.TABLEA.MO_ID = 201202) OR (INFORM_USER.TABLEA.MO_ID = 201201)"); an OR condition in a WHERE clause can lead to a full table scan.
So to avoid the FTS, you can introduce an SI on the columns used in the WHERE condition: define a USI or NUSI on top of MOD_ID and collect stats (see the sketch below). Hopefully that will help avoid the full table scan.
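A minimal sketch of that suggestion, using the table and column names from the EXPLAIN above (the choice of a plain NUSI is an assumption):

CREATE INDEX (MO_ID) ON INFORM_USER.TABLEA;              /* NUSI on the month column used in the WHERE clause */
COLLECT STATISTICS ON INFORM_USER.TABLEA COLUMN (MO_ID); /* let the optimizer cost the new access path */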
 
 

Fetching the data for the last quarter dates - forum topic by bikky6


Hi
I am using TD13. I have a query which runs on particular dates. Whenever the query runs, it has to pick up the last quarter's dates.

How can I extract the last quarter's dates for a BETWEEN condition in TD without accessing the calendar table?


Fetching the data for the last quarter dates - response (2) by Raja_KT

Fetching the data for the last quarter dates - response (3) by M.Saeed Khurram


Hi Bikky,
What if you try this?

WHERE COL_NAME BETWEEN ADD_MONTHS(CURRENT_DATE, -3) AND CURRENT_DATE

 


Teradata Dbase Error:5758 - response (4) by Raja_KT


Hi Swathi,

You can try running the query and verifying it against the data step by step, then make the changes in the INSERT and see. I guess the UPDATE is fine.

Don't run the entire statement at once; just run the query and verify it against the data, keeping in mind that you have a LEFT OUTER JOIN with the parent table, a SELECT DISTINCT, and a lot of SUBSTRING calls. I am not sure about the performance.

 

Cheers,

Raja

When are secondary indexes required? - response (2) by Raja_KT


There may be cases where queries do not use the PI. Then an SI comes into the picture to enhance performance and improve the chance of avoiding a FTS. A value-ordered NUSI is recommended for range queries (see the sketch below).

They can be created and dropped at any time.

Understanding the business requirement and the design drives us to create SIs.

 

Cheers,

Raja
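A minimal sketch of a value-ordered NUSI for range predicates; the table and column names here are made up for illustration:

/* Rows in the index subtable are ordered by value, which helps BETWEEN-style range predicates */
CREATE INDEX (txn_date) ORDER BY VALUES (txn_date) ON sales_db.daily_txn;
COLLECT STATISTICS ON sales_db.daily_txn COLUMN (txn_date);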

Inserting to a table incrementally - forum topic by bikky6


Hi
What's the best approach to load the target table from the source incrementally? Let's say we have 1 million rows; I need to load the table in batches of 1 lakh (100,000) rows.


Fetching the data for the last quarter dates - response (4) by bikky6


Suppose we take the 1st quarter, Jan-March. If my job runs in March, I need to pick the data from Oct 1st to Dec 31st (the previous quarter).
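A minimal sketch of that logic without the calendar table; prev_qtr_start and prev_qtr_end are just illustrative column aliases:

/* (EXTRACT(MONTH ...) - 1) MOD 3 = whole months already elapsed in the current quarter;
   stepping back that many months plus three more from the first of the current month
   lands on the previous quarter's start.                                               */
SELECT
    ADD_MONTHS(CURRENT_DATE - (EXTRACT(DAY FROM CURRENT_DATE) - 1),
               -(((EXTRACT(MONTH FROM CURRENT_DATE) - 1) MOD 3) + 3)) AS prev_qtr_start,
    ADD_MONTHS(CURRENT_DATE - (EXTRACT(DAY FROM CURRENT_DATE) - 1),
               -((EXTRACT(MONTH FROM CURRENT_DATE) - 1) MOD 3)) - 1   AS prev_qtr_end;

For a run date in March 2014 this returns 2013-10-01 and 2013-12-31, which can then be used as the bounds of the BETWEEN condition.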

Fetching the data for the last quarter dates - response (5) by Raja_KT


An easier way is to use a scheduler. Maybe a stored procedure too, since you can put whatever logic you want into it.

Cheers,

Raja

 

Inserting to a table incrementally - response (1) by Raja_KT


Hi Bikky,
Tools are always better, but if you don't have tools, then you have to use queries. You can think of this option: keep one table to track the load, based on an identifiable field or fields. Use ROW_NUMBER() and QUALIFY to restrict the rows you want and insert the required data (see the sketch below).
There may be other ways too.
You have not said whether the source is a table, a file, or some other DBMS.
Cheers,
Raja
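A minimal sketch of that option; tgt_table, src_table, load_track, key1, and the 100,000-row batch size are all illustrative names that would need to match your own objects:

/* One-row tracking table holding the highest key loaded so far */
CREATE TABLE load_track (last_key INTEGER);
INSERT INTO load_track VALUES (0);

/* Load the next batch of one lakh rows, ordered by the key */
INSERT INTO tgt_table
SELECT s.*
FROM   src_table s
WHERE  s.key1 > (SELECT MAX(last_key) FROM load_track)
QUALIFY ROW_NUMBER() OVER (ORDER BY s.key1) <= 100000;

/* Advance the high-water mark for the next run */
DELETE FROM load_track;
INSERT INTO load_track SELECT MAX(key1) FROM tgt_table;

Each run then picks up where the previous one stopped, until the source is exhausted.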

arc error - forum topic by ZhangJiahao


Hi all, when I restore a table from a backup tape, I get the following error message: *** Failure 9011: Table being Archived/Restored/Copied was dropped by some other transaction. How do I handle it? Thanks.


Copy DB with ArcMain - response (2) by ZhangJiahao


I am faced with the same question. How do I handle it? Thanks.

PMCP - forum topic by Raja_KT


Hi,
Does PMCP come along with the TD installation? I am curious since I am not able to see the PMCPINFO views.
Thanks and regards,
Raja
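A quick way to check whether any PMCP objects exist on your system; DBC.TablesV is the standard dictionary view, and the LIKE pattern simply matches the view name mentioned above:

/* List PMCP-related dictionary objects visible on this system */
SELECT DatabaseName, TableName, TableKind
FROM   DBC.TablesV
WHERE  TableName LIKE 'PMCP%';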


arc error - response (1) by Raja_KT


Hi Zhang,
 

 
 
Explanation: The table that is being Archived/Restored/Copied has been dropped by some other transaction. This can happen when the HUT Lock is released using RELEASE LOCK statement allowing for Drop Table.
Generated By: ModifyTblHdr.
For Whom: End Users.
Remedy: This is a user error. The user must not release the HUT lock and drop the table while the Archive/Restore/Copy is in progress.
Cheers,
Raja

 

Worker is Passive mode - response (1) by sheru4u


Hi Team,
 
I am using Teradata Aster Express 5.0. I downloaded the VM images from the Teradata site. While trying to get hands-on with Aster, the Queen is up but the Worker is not; it hangs during the Linux VM boot at the step "Starting nCluster services:".
Please suggest what to do next. A quick response is highly appreciated.
 
Thanks,
Pramod

surrogate scripts - response (1) by Raja_KT

$
0
0

Hi Reddy,

There are a few ways of generating surrogate keys, but I can't post the script because of restrictions in my company's policies.

One way is to generate a sequence.

Another way is to create a lookup table. Put the key fields you are getting from the source there, along with sk_col. Join your source table with the lookup table through a LEFT OUTER JOIN.

Now, to get the increment, you select the max of sk_col + ROW_NUMBER() over the key fields (see the sketch below).

The script itself I can't share.

 

Cheers,

Raja
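A minimal sketch of the lookup-table approach described above; lookup_tbl, src_stage, and the key column names are all made up for illustration:

/* Assign new surrogate keys to natural keys not yet present in the lookup */
INSERT INTO lookup_tbl (nat_key1, nat_key2, sk_col)
SELECT s.nat_key1,
       s.nat_key2,
       m.max_sk + ROW_NUMBER() OVER (ORDER BY s.nat_key1, s.nat_key2) AS sk_col
FROM   src_stage s
LEFT OUTER JOIN lookup_tbl l
       ON  s.nat_key1 = l.nat_key1
       AND s.nat_key2 = l.nat_key2
CROSS JOIN (SELECT COALESCE(MAX(sk_col), 0) AS max_sk FROM lookup_tbl) m
WHERE  l.nat_key1 IS NULL;   /* only rows whose keys are not already assigned */

The source rows can then be joined back to lookup_tbl to pick up their surrogate keys.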
