Channel: Teradata Forums - All forums

SQL help - response (23) by Adeel Chaudhry


Is the table SET or MULTISET? Did you use INSERT .... SELECT for the insertion?
 
It can be due to full-row duplicates getting suppressed .... a SET table silently discards exact duplicate rows.


Table Stats - response (1) by Adeel Chaudhry


You need to identify which columns will be used frequently in JOINs, and then collect stats on them using the command below:
 
COLLECT STATISTICS ON <Database Name>.<Table Name> COLUMN (<Column Name>);
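For example, on a hypothetical table:
 
COLLECT STATISTICS ON SalesDB.Sales_Fact COLUMN (Customer_Id);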
 

Table Stats - response (2) by Adeel Chaudhry


Also .... in case you have some queries handy .... which will be frequently executed .... you can execute the following command in a session:
 
DIAGNOSTIC HELPSTATS ON FOR SESSION;
 
Then get the EXPLAIN of the query (F6 key) and it will suggest some stats to collect. You do need to execute those COLLECT STATISTICS statements for them to take effect.
 
Also note, this command only gives you hints about the stats to collect. The best set may not be the same as the suggestions.
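A minimal illustration of the workflow (database, table, and query are made up):
 
DIAGNOSTIC HELPSTATS ON FOR SESSION;
 
EXPLAIN
SELECT Customer_Id, SUM(Sale_Amt)
FROM SalesDB.Sales_Fact
WHERE Sale_Date = DATE '2014-01-01'
GROUP BY Customer_Id;
 
-- the EXPLAIN output now ends with a list of recommended
-- COLLECT STATISTICS statements; run the useful ones yourself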
 
HTH!

Best way to generate the Sequential numbers : CSUM or IDENTITY columns? - response (10) by Adeel Chaudhry


Reddy, just replied to your post .... please have a look.

FYP ideas - response (5) by Adeel Chaudhry


What will be the duration in which you can work? 12 months?
 
I would suggest you first finalize the team .... one member or more .... and start thinking/working like a team, so the idea is of mutual interest. :)
 
Will you guys be touching the front-end [reporting] or only the back-end [database]?

TPT 14.10 output to named pipe and then gzip to final files - response (19) by ericsun2


Validated.

  • 14.10.00.02 has the "GetPos" issue even after removing "-z 60" option
  • 14.10.00.03 works fine without "-z 60" option

The multiple gzip named pipes work again now.
Thanks, Steven.

SQL help - response (24) by mailtodvyag@gmail.com

Teradata with SSIS - response (5) by flash7gordon


jgreenwood, I'm not sure if you're talking about the TD server 14.x or the TD 14.x client driver install. We don't have server 14.x in our environment yet, so I can't help you there. Here were my results with the TD client 14.x. For SSIS I was on an MS SQL 2008 install, and hence SSIS 2008, on a Win 2008 server machine with a brand-new (2014-02) 14.x Teradata client install. I did what I described above and it worked for SSIS access. I can also report that a .NET C# program with Teradata LDAP access, compiled on a laptop with a TD 13.x client, ran on the MS SQL 2008 server with the TD client 14.x.
One thing I would check if you are having problems with SSIS connecting is what kind of authentication mechanism your Teradata environment is using. The instructions above will only work for LDAP. The other types, I think, are TD2 and SPNEGO. We poor slobs who are just users of huge corporate warehouses aren't always told what the authentication method is, so sometimes you have to ask.
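If it helps, with the Teradata .NET Data Provider the mechanism is spelled out in the connection string .... something like this (server and account names are made up, and the exact key name is from memory, so double-check it):
 
Data Source=mytdpid;User Id=someuser;Password=*****;Authentication Mechanism=LDAP;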


TPT 14.10 output to named pipe and then gzip to final files - response (20) by feinholz


Thanks for the validation (and your patience).

Need Header Row in FastExport / TPT Export.? - response (7) by feinholz


TPT cannot load data into any database other than Teradata.
(You can use TPT to move data from a non-Teradata database to Teradata without landing the data to disk.)
However, TPT is a bulk data loading/unloading suite of tools.
We do not include column headers (that functionality is best left to a report writing tool).

error: The format or data contains a bad character - response (3) by skchintha


Thanks for the reply.
I am new to SQL.
 
I used the CAST function but it's throwing an error:
 
select    a18.Prod_ClassID  Prod_ClassID,
        sum(a11.EXTENDEDPRICE)  WJXBFS1
    from    REPORTING_V.TransactionLine    a11
        join    REPORTING_V.CALENDAR    a12
          on     (a11.BUSINESSDAYID = a12.BUSINESSDAYID)
        join    REPORTING_V.CALENDAR    a13
          on     (a12.FiscalWeekID =CAST(substr(a13.FiscalWeekID,1,4) AS INTEGER) -1) ||  cast(substr(a13.FiscalWeekID,5,2) as integer))
        join    REPORTING_V.CALENDARWEEK    a14
          on     (a13.CalendarWeek = a14.CalendarWeek)
        join    REPORTING_V.CALENDARMONTH    a15
          on     (a14.CalendarMonth = a15.CalendarMonth)
        join    REPORTING_V.CALENDARQUARTER    a16
          on     (a15.CalendarQuarter = a16.CalendarQuarter)
        join    REPORTING_V.PRODUCT    a17
          on     (a11.ITEMID = a17.ITEMID)
        join    REPORTING_V.PROD_SUBCLASS    a18
          on     (a17.Prod_ClassID = a18.Prod_ClassID and
        a17.Prod_SubClassID = a18.Prod_SubClassID)
    where    (a16.CalendarYear in (2013)
     and a11.BUSINESSUNITGROUPID in (1, 5, 6, 7, 9)
     and a13.FiscalWeekID in (201335))
    group by    a18.Prod_ClassID
 
 
The error is: expected something between ')' and '||'.
 
If I remove the second CAST function it works fine.
Are these joins correct? It should bring me last year's week, but it's not returning any data.
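For reference, the condition I am trying to write is something like this, with balanced parentheses (untested .... it assumes FiscalWeekID is a numeric YYYYWW value such as 201335, and uses arithmetic instead of || to avoid mixing strings and integers):
 
          on     (a12.FiscalWeekID = (CAST(substr(a13.FiscalWeekID,1,4) AS INTEGER) - 1) * 100
                                     + CAST(substr(a13.FiscalWeekID,5,2) AS INTEGER))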
 
Please let me know.
Thanks.
 

SSIS and Global Temp Tables - response (3) by Richard34


Thanks Adeel for your advice, and thanks Saeed for your help too.  I was able to create a permanent table in EDW, and that solved it. You guys really rock! 

Delete table from within SSIS - forum topic by Richard34


Hello everyone.  I am looking to delete all data in a permanent table in EDW from within an SSIS package.  I have an OLE DB connection to Teradata.  Do I need to create a stored procedure in Teradata and call it from within the package?  Thanks
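A minimal sketch of what I have in mind .... would an Execute SQL Task running a statement like this over the OLE DB connection be enough? (The table name is made up.)
 
DELETE FROM EDW.My_Stage_Table ALL;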


Backlog issue occuring when UNION ALLing multiple TPT DataConnector operators into Stream - forum topic by abiffle


Hi all,
 
I am encountering an unexpected situation when using multiple DataConnector operators UNION ALLed together, and would appreciate your input/suggestions.
 
I have a process that uses TPT DataConnector and TPT Stream to load files that are being continuously received.  The process has been working fine, but requires two instances of the DataConnector operator in order to prevent file reads from being a bottleneck.
 
I now have the requirement of loading the files in the order received.  I can accomplish this through the VigilSortField property of DataConnector, but I cannot use multiple instances when using this property.
 
In order to get around this limitation, I was hoping to achieve the same parallelism by using two distinct DataConnector operators, each processing half the files and UNION ALLing into one Stream operator.  This approach seemed to fit my situation well, since two files with different file patterns are received every 5 seconds.
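The APPLY step looks roughly like this (operator, table, and column names are placeholders, not my actual script):
 
APPLY ('INSERT INTO Target_Table (col1, col2) VALUES (:col1, :col2);')
TO OPERATOR (STREAM_OPERATOR[1])
SELECT * FROM OPERATOR (READER_SMALL[1])
UNION ALL
SELECT * FROM OPERATOR (READER_LARGE[1]);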
 
However, I am running into an issue because one of the two files is consistently larger.  The DataConnector operator processing the larger files is consistently falling behind, and is unable to ever process its backlog.  
 
Based on the feinholz quote below, I understand that the balancing between the two operators is based on file size rather than file count, so I believe it is expected that the operator processing the larger files would process fewer files per checkpoint.

"When using multiple instances to read from multiple files, we load balance the files across the instances according to the file sizes."
http://forums.teradata.com/forum/tools/tpt-instances-how-this-works

 
However, my scenario in which files are constantly received creates a situation in which this "larger-file" operator is never able to process its backlog.  I would guess that this is occurring because whenever either DataConnector finishes processing all files detected during its directory scan, it causes both operators to checkpoint.
 
For example:

  • Directory Scan:  DataConnector Small notices X files, DataConnector Large notices X files
  • Select Phase: DataConnector Small processes X files, DataConnector Large processes (X - R) files, leaving a remainder of R files unprocessed due to filesize-based balancing
  • Checkpoint (triggered by DataConnector Small finishing its directory scan list, even though DataConnector Large still has R files it could process)
  • --
  • Directory Scan:  DataConnector Small notices Y files, DataConnector Large notices Y + R files
  • etc -- R grows unbounded until VigilMaxFiles is exceeded and the process aborts

 
Can anyone suggest a way to resolve this issue?  Is there any setting that will change the balancing to file count rather than file size, or prevent the checkpoint from occurring until both DataConnector operators have processed their entire directory scan?
(I am using TPT 14.0, VigilMaxFiles of 50,000, and VigilWaitTime of 1.  No -l latency_interval is set on the tbuild command.)
 
Thanks,
Adam

 


Keep getting StatusCode="2802",ErrorMessage="SQLState =23505" after delete and reinsert - forum topic by zhiyanshao


Hi, I was wondering if there is a cache or anything involved. After I clear the table using DELETE FROM and reinsert the same set of rows, I often get this error on a totally unique record.
 
CREATE MULTISET TABLE SANDBOX.MERCHANT_SETTLEMENT ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      settlement_id INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY
           (START WITH 1 
            INCREMENT BY 1 
            MINVALUE -2147483647 
            MAXVALUE 2147483647 
            NO CYCLE),
      booking_num INTEGER NOT NULL,
      settlement_source INTEGER NOT NULL,
....
      CONSTRAINT MRCHSET_SOURCE_ENUM CHECK ( settlement_source =  1  ),
      CONSTRAINT MERCH_SETTLEMENT_PK PRIMARY KEY ( settlement_id ))
PRIMARY INDEX ( booking_num );


CHAR(n) CHAR SET UNICODE/LATIN define schema in 14.xx - multiply by 2 or 3 - forum topic by ericsun2


In a TPT script that uses "USING CHARACTER SET UTF8" in 13.10,
we multiply the char_length() of a CHAR or VARCHAR column by 3, so for example:

  • CHAR(2) in DDL = CHAR(6) in TPT
  • VARCHAR(20) in DDL = VARCHAR(60) in TPT

It seems that in TPT 14.00 and 14.10, if we try to use UTF8, the following definitions work:

  • CHAR(2) CHAR SET UNICODE in DDL = CHAR(6) in TPT
  • CHAR(2) CHAR SET LATIN in DDL = CHAR(4) in TPT
  • VARCHAR(20) CHAR SET UNICODE in DDL = VARCHAR(60) in TPT
  • VARCHAR(20) CHAR SET LATIN in DDL = VARCHAR(60) in TPT

The following DDL will only work with the following TPT schema definition:

 CREATE SET TABLE tpt_data_type ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      PRODUCT_ID INTEGER NOT NULL,
      ORDER_ID BIGINT NOT NULL,
      ORDER_DESC VARCHAR(100) CHARACTER SET UNICODE NOT CASESPECIFIC,
      STORE_CODE VARCHAR(20) CHARACTER SET LATIN NOT CASESPECIFIC,
      ACTIVE_FLAG CHAR(1) CHARACTER SET LATIN NOT CASESPECIFIC,
      UNICODE_FLAG CHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC,
      ORDER_AMT DECIMAL(15,2),
      ORDER_TIME TIMESTAMP(0),
      ORDER_STATE CHAR(1) CHARACTER SET LATIN NOT CASESPECIFIC,
      UPDATED_TS TIMESTAMP(0))
PRIMARY INDEX ( ORDER_ID ); 
USING CHARACTER SET UTF8 
DEFINE JOB EXPORT_TO_FASTLOAD_FORMAT
DESCRIPTION 'Export from ' || @SourceTableName || ' to the INDICDATA file: ' || @DataFileName
(
DEFINE SCHEMA DATA_FILE_SCHEMA
(
"PRODUCT_ID" Int,
"ORDER_ID" BigInt,
"ORDER_DESC" Varchar(300),	/* 100 x 3 */
"STORE_CODE" Varchar(60),	/* 20 x 3 */
"ACTIVE_FLAG" Char(2),		/* 1 x 2 */
"UNICODE_FLAG" Char(3),         /* 1 x 3 */
"ORDER_AMT" Decimal(15,2),
"ORDER_TIME" Timestamp(0),
"ORDER_STATE" Char(2),		/* 1 x 2 */
"UPDATED_TS" Timestamp(0)
);

  DEFINE OPERATOR EXPORT_OPERATOR
  TYPE EXPORT
  SCHEMA DATA_FILE_SCHEMA
  ATTRIBUTES
  (
  VARCHAR PrivateLogName    = @SourceTableName || '_log',
  VARCHAR TdpId             = @TdpId,
  VARCHAR UserName          = @UserName,
  VARCHAR UserPassword      = @UserPassword,
  VARCHAR QueryBandSessInfo = 'Action=TPT_EXPORT; Format=Fastload;',
  VARCHAR SpoolMode         = 'noSpool',
  INTEGER MaxDecimalDigits  = 18,
  VARCHAR DateForm          = 'INTEGERDATE',
  VARCHAR SelectStmt        = 'select * from ' || @SourceTableName  
  );

  DEFINE OPERATOR FILE_WRITER
  TYPE DATACONNECTOR CONSUMER
  SCHEMA *
  ATTRIBUTES
  (
  VARCHAR PrivateLogName   = 'indicdata_writor_log',
  VARCHAR DirectoryPath    = @DataFilePath,
  VARCHAR FileName         = @DataFileName,
  VARCHAR Format           = 'Formatted',
  VARCHAR OpenMode         = 'Write',
  VARCHAR IndicatorMode    = 'Y'
  );
  
  APPLY TO OPERATOR (FILE_WRITER[@DataFileCount])
  SELECT * FROM OPERATOR (EXPORT_OPERATOR[@NumOfReader]);
);
$ hexdump -C /var/tmp/tpt_data_type.fastload 
00000000  64 00 00 00 12 27 00 00  15 cd 5b 07 00 00 00 00  |d....'...�[.....|
00000010  19 00 44 55 4d 4d 59 20  44 45 53 43 20 e6 95 99  |..DUMMY DESC �..|
00000020  e5 ad a6 e8 ae be e8 ae  a1 20 21 04 00 53 45 41  |学设计 !..SEA|
00000030  32 59 20 e5 85 a8 bf 09  00 00 00 00 00 00 32 30  |2Y �.��.......20|
00000040  31 34 2d 30 32 2d 31 32  20 31 32 3a 31 32 3a 32  |14-02-12 12:12:2|
00000050  32 50 20 32 30 31 34 2d  30 32 2d 31 39 20 31 37  |2P 2014-02-19 17|
00000060  3a 32 37 3a 35 37 0a 64  00 00 00 19 27 00 00 45  |:27:57.d....'..E|
00000070  ef 54 07 00 00 00 00 19  00 44 55 4d 4d 59 20 44  |�T.......DUMMY D|
00000080  45 53 43 20 e5 af bb e6  89 be e9 87 91 e6 98 9f  |ESC 寻�.��..�..|
00000090  20 21 04 00 4c 41 58 33  4e 20 e5 85 8d 83 13 00  | !..LAX3N �.....|
000000a0  00 00 00 00 00 32 30 31  34 2d 30 31 2d 31 31 20  |.....2014-01-11 |
000000b0  31 31 3a 31 31 3a 31 31  57 20 32 30 31 34 2d 30  |11:11:11W 2014-0|
000000c0  32 2d 31 39 20 31 37 3a  32 37 3a 35 37 0a        |2-19 17:27:57.|
000000ce

The CHAR(n) CHAR SET LATIN columns behave differently than in the previous version. Can someone please confirm?
 


fastload on linux - forum topic by rmanda01


Hi,
I have data files created on AIX (UNIX) using fexp with INDICATORS, and moved the files to a Linux server.  FastLoading the data files into the tables fails on Linux with this error:
 
**** 18:30:39 I/O Error on File Checkpoint: 42, Text: Unable to obtain
              data signature !ERROR! Unexpected data format
 
 FASTLOAD UTILITY     VERSION 14.00.00.003   PLATFORM LINUX
 Any help?
 


Teradata with SSIS - response (6) by flash7gordon


jgreenwood, TD2 is the native Teradata security mechanism.  If it doesn't take too long, just try what you did before, but put TD2 for the Auth Mechanism rather than LDAP.

ODBC client on windows - forum topic by ricky_ru


I am going crazy trying to install the ODBC driver on Windows.
1) I downloaded the version tdodbc__windows_i386.13.10.00.10.zip from here: http://downloads.teradata.com/download/connectivity/odbc-driver/windows
2) Installed the packages in the right order: ICU -> GSS -> ODBC.
3) At the end of the install, I get the option to launch the ODBC config UI and the readme, and I could configure an ODBC DSN.
4) But the problem is that when I want to add a new DSN, there is no Teradata item in the list to select. And I could not remove/configure the DSN from step 3, because a "driver not found/not registered" error is shown.
5) It looks to me like in step 3 the driver information is in memory, but it could not be written to the system registry or other files.

I tried installing as admin and modifying the registry permissions, but I still have the problem. Also, the DSN dialog has no port number field and no test button.

Do you have any idea?

Thanks,
Ricky


ODBC client on windows - response (1) by Adeel Chaudhry


Is the DSN created? Is it a 64-bit Windows? Which ODBC are you checking .... ODBC 64-bit or ODBC 32-bit? Which tool are you trying to connect with .... Teradata Admin? SQL Assistant?
