Monitor Index Creation - response (1) by dnoeth
Impact of unnecessary compress on all char and varchar columns. - response (2) by dnoeth
Adding to #3:
DefaultCaseSpec is only for string comparisons in Teradata mode sessions, but the default for a new column is based on the session mode: always NOT CASESPECIFIC in a Teradata session, but CASESPECIFIC in ANSI.
Besides the global DefaultCharacterSet, there's also a user-level DEFAULT CHARACTER SET to set Latin/Unicode as the default.
Monitor Index Creation - response (2) by arpitsinha
Thanks Dieter
Impact of unnecessary compress on all char and varchar columns. - response (3) by CarlosAL
Dieter:
I'm not sure what you meant by "always NOT CASESPECIFIC in a Teradata session".
In Teradata session mode (which is the 'usual' mode, and the mode I was assuming for the OP), DefaultCaseSpec rules the CASESPECIFIC mode of newly created columns.
If DefaultCaseSpec = TRUE in dbscontrol and you create a table with character columns defaulted, these columns will be CASESPECIFIC.
Cheers.
Carlos.
Parsing CompressValueList column from DBC.Columns - forum topic by k_jalev
Hello,
I need to take all values on which compression is defined for a particular column. I do it by parsing the value of the CompressValueList column from DBC.Columns, but I noticed that in some special cases (happily, not a real case for me so far) the value of the column is written in a way that confuses the parsing. For example:
COMPRESS ('A'','',M','B','CD')
Should compress these 3 values:
A',',M
B
CD
But what appears in the column is:
('A',',M','B','CD')
Which is totally the same as if I specify:
COMPRESS ('A',',M','B','CD')
which totally confuses my parsing logic.
I suppose that when the DB composes this string, the single quotes are not escaped.
I searched for a workaround, but with no success. Is there any solution or explanation for that?
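The behavior can be reproduced with a small demo table (the table and column names here are made up for illustration):

```sql
-- One compress value contains embedded quotes and a comma: A',',M
CREATE TABLE demo_compress (
    id INTEGER,
    col1 VARCHAR(20) COMPRESS ('A'','',M', 'B', 'CD')
);

-- DBC.Columns shows the list with the inner quotes unescaped:
--   ('A',',M','B','CD')
-- indistinguishable from COMPRESS ('A', ',M', 'B', 'CD')
SELECT CompressValueList
FROM DBC.Columns
WHERE TableName = 'demo_compress'
  AND ColumnName = 'col1';

-- SHOW TABLE, by contrast, returns the correctly escaped definition
SHOW TABLE demo_compress;
```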
Impact of unnecessary compress on all char and varchar columns. - response (4) by dnoeth
Hi Carlos,
of course you're correct, DefaultCaseSpec also controls the default for new columns (I was under the impression this is only for comparisons, my fault).
Parsing CompressValueList column from DBC.Columns - response (1) by dnoeth
SHOW TABLE returns correct data, so at least it's compressing the correct values :)
I don't think there's a workaround, so you might open an incident to get this fixed...
Parsing CompressValueList column from DBC.Columns - response (2) by k_jalev
Thank you, glad to hear this from someone else:)
And yes, the table definition is OK
Bteq script to read and execute query from a table - response (10) by ANUBHA
Hi Dnoeth,
Is there any other approach apart from exporting the column value 'select...' and executing it after importing?
My Requirement is similar
ID Query1
1 select * from t1;
2 select * from t2;
I need to execute the values of the 'Query1' column in BTEQ without exporting the data. Is there any other approach?
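For reference, the usual export-and-run approach (the one you are trying to avoid) looks roughly like this; the logon string, file name, and table name are placeholders:

```
.LOGON tdpid/user,password;
.SET WIDTH 2000
.SET TITLEDASHES OFF
.EXPORT REPORT FILE = run_queries.btq
SELECT Query1 (TITLE '')
FROM QueryTable
ORDER BY ID;
.EXPORT RESET
.RUN FILE = run_queries.btq
.LOGOFF
```

As far as I know, BTEQ cannot treat a column value as a statement directly, which is why the export step materializes the statements into a file that .RUN then executes.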
script for ODBC connection - forum topic by PavelDriver
Hello.
Often it is necessary to create a connection to the Teradata database, and I want to automate this process.
I use Teradata ODBC Driver 13.10.00.01.
I tried to do it from a bat file with the following:
%WINDIR%\System32\odbcconf.exe configsysdsn "Teradata" "DSN=Teradata1|Description=Teradata|DBCName=Teradata"
It gives an error message:
CONFIGSYSDSN: Unable to create a data source for the 'Teradata' driver:
Component not found in the registry with error code -2147467259
Can anyone advise how to create the DSN?
Thanks.
Connecting Python to Teradata over ODBC - response (17) by Roopalini
Thanks for sharing the link Eric. I shall go over it and try to get that installed.
Regarding Certification Exams - Pass % - response (26) by pa280602
Hi,
I recently took the Teradata 14 Basics exam and haven't received the results yet.
Can anyone tell me the exact passing percentage required for the Teradata 14 Basics exam?
how to convert UTF8 Codeset to ISO 8859 codeset conversion - forum topic by vuyyuru
Hi,
I want to convert the codeset of col1 to the ISO 8859 codeset.
SEL
CASE
WHEN t.COL1 IS NOT NULL THEN '' || t.COL1
ELSE ''
END AS COL1
FROM TABLEA t;
How can I rewrite this one? Please help me.
Regards,
Rams
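If COL1 is stored as UNICODE, one way to get it into the Latin (ISO 8859-1) server character set is Teradata's TRANSLATE function; this is a sketch against the table from the question:

```sql
-- Sketch: convert a Unicode column to the Latin (ISO 8859-1) character set
SELECT TRANSLATE(COL1 USING UNICODE_TO_LATIN) AS col1_latin
FROM TABLEA;
```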
TPT UTF8 Import of "no valid unicode character" - need help - response (8) by david.craig
An import of U+10002A is uncommon as it is a user-defined character in the Supplementary Private Use Area-B. It could also be a corrupted encoding. Private use has been used by Japanese communications companies to encode Emoji.
BIDS can't find Tdata via Attunity - forum topic by purpleintrepid
Hello.
I'm trying to get my SSIS environment connecting to Teradata after installing some Oracle tools, and it's not going well. I've reinstalled Attunity's drivers, the Teradata tools, and the Teradata ODBC and .NET drivers, and my BIDS environment still says No connectivity to Teradata is available.
What else do I need to check or reinstall?
Thanks for any help you can provide,
-Beth
BIDS can't find Tdata via Attunity - response (1) by purpleintrepid
Found the solution on StackOverflow: I was installing the latest version (3.0) of the Attunity driver but using Visual Studio 2010, so I needed to install 2.0 instead. Installing the older version worked.
Volatile Tables in Macros - response (3) by flash7gordon
Note to Teradata product people: what a joke. This guy wants to use some temp tables in a macro to get to a result, he's done that a thousand times before in MS SQL, and now he's being told he can't have a single temp table with DML in a Teradata macro. I'm pretty sure successive temp tables work in Oracle too. My boss used temp tables to make our company what it is, and it's a company you've heard of. So get with the program, Teradata, and get your parsers into the 20th century (I won't ask for the 21st) with your next release. I won't even talk about my problem, which is how to get a .NET call to Teradata to use a temp table when each invocation of Teradata is a new session.
Aggregation step taking time Teradata Explain Plan - forum topic by spirosmike
Hi
I have a set of SQLs that run for a long time and consume a lot of CPU, and all of them spend that time in an aggregation step whose explain looks like the one below. Stats are collected on the obvious joining columns, but I'm not sure if they are sufficient.
Can someone please let me know what happens in an aggregation step, and what conclusions we can draw when this happens, to try and resolve it?
Any ideas/thoughts are highly appreciated.
Thanks
We do an all-AMPs SUM step to aggregate from Spool 17 (Last Use)
by way of an all-rows scan , grouping by field1 (
A
,B
,C
,D
,E
,F
,G
). Aggregate
Intermediate Results are computed globally, then placed in Spool
28. The aggregate spool file will not be cached in memory. The
size of Spool 28 is estimated with low confidence to be
762,227,092 rows (645,606,346,924 bytes). The estimated time for
this step is 31 minutes and 4 seconds.
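When the aggregate spool estimate has low confidence like this, one common first step is to collect multi-column statistics on the grouping columns; the table name below is a placeholder, and A..G stand for the real column names from the explain:

```sql
-- Sketch: statistics on the GROUP BY combination help the optimizer
-- estimate the size of the aggregation spool more accurately.
COLLECT STATISTICS COLUMN (A, B, C, D, E, F, G) ON MyTable;
```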
utf8TO16 : why is it Latin to unicode conversion function when utf8 itself is unicode encoding? - response (3) by vuyyuru
Hi Dieter,
can you please help in converting data from UTF8 to LATIN1_0A or ISO8859_1
I tried the query below, but I get the error 'the string contains an untranslatable character'. In the same way, with TRANSLATE(COL1 USING UTF8_TO_LATIN1_OA) AS COL1 it says 'unknown character string'.
Can we do this with an SQL query, and if so, how? Please advise.
SEL
TRANSLATE(COL1 USING UNICODE_TO_LATIN) AS COL1
FROM
(
SEL
CASE
WHEN DB.COL1 IS NOT NULL THEN '' || DB.COL1
ELSE ''
END AS COL1
FROM TABLEA DB
) DT;
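If some code points simply have no Latin equivalent, one option (an assumption about the intent here, not from the thread) is the WITH ERROR clause, which substitutes an error character instead of failing the whole query:

```sql
-- Sketch: tolerate untranslatable characters instead of erroring out;
-- untranslatable code points are replaced by the error character 0x1A
SELECT TRANSLATE(COL1 USING UNICODE_TO_LATIN WITH ERROR) AS col1_latin
FROM TABLEA;
```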
Aggregation step taking time Teradata Explain Plan - response (1) by dnoeth
If there's no GROUP BY in your SELECT, it's probably a DISTINCT rewritten by the optimizer as an aggregation.
If the estimated numbers are correct, this step might run for a long time. Did you check dbc.QryLogStepsV to see whether this step actually uses lots of resources?
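A query along these lines can show the per-step resource usage, assuming step-level logging (WITH STEPINFO) is enabled in DBQL; the QueryID value is a placeholder:

```sql
-- Sketch: per-step CPU and I/O from the DBQL step log
SELECT StepLev1Num, StepName, CPUTime, IOCount, EstProcTime
FROM dbc.QryLogStepsV
WHERE QueryID = 123456789012345678
ORDER BY StepLev1Num;
```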
Monitor Index Creation - response (3) by dnoeth
Hi Arpit,
AFAIK there's no direct way.
If you query the current size of the table using dbc.TableSizeV before running the CREATE INDEX and calculate the expected size of that index, you can see the increase when you query dbc.TableSizeV again during the create.
If you got access to Ferret you could also run SHOWBLOCKS on that index to see the growth.
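A size check along these lines could be run before and repeatedly during the CREATE INDEX; the database and table names are placeholders:

```sql
-- Sketch: current permanent space of a table, summed across all AMPs
SELECT DatabaseName, TableName, SUM(CurrentPerm) AS CurrentPermBytes
FROM dbc.TableSizeV
WHERE DatabaseName = 'MyDB'
  AND TableName = 'MyTable'
GROUP BY 1, 2;
```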