
Location of SQL History in Studio - response (1) by ramesh.d


Hello Mahesh,
You do not need to add entries to the history manually. Is an error displayed while you are trying to import the history XML? You can check the error log by selecting Window -> Show View -> Other Windows -> Error Log.
Procedure to import:
1) Click the Import History toolbar button in the Teradata SQL History view.
2) Browse to the history XML, select "History (*.xml)" as the source type, and click OK.
If things are still not working, can you zip the XML file and attach it here?
Ramesh.
 


SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition - response (9) by ericsun2


Hi Steven,
Thanks for the reply. Let me clarify my question:

  1. NET_ID is defined as DECIMAL(38,0) in the DDL, but what should I define in the TPT file schema?
    • when MaxDecimalDigits = 18: shall I use VARCHAR(18), VARCHAR(19), VARCHAR(20) or VARCHAR(38)?
    • when MaxDecimalDigits = 38: shall I use VARCHAR(38), VARCHAR(39) or VARCHAR(40)?
  2. NET_NAME is defined as VARCHAR(300) CHAR SET UNICODE in DDL
    • when in UTF-8: I should use VARCHAR(900)
    • when in UTF-16: I should use VARCHAR(600)
    • when in ASCII: I should use VARCHAR(300), right?
  3. NET_KEY is defined as Decimal(9,0) in DDL, shall I use VARCHAR(9), VARCHAR(10) or VARCHAR(11)?
  4. If I have NET_TRAFFIC_BYTES defined as BIGINT in the DDL, shall I use VARCHAR(20), VARCHAR(21) or VARCHAR(22)? (See the sizing sketch after this list.)
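
Not an authoritative answer, but a sketch of the sizing rule as I understand it: when the Selector returns numeric columns as character data, allow one extra character for the sign (plus one more for a decimal point when the scale is non-zero), and multiply UNICODE lengths by the session character set's bytes per character (3 for UTF-8, 2 for UTF-16). Under those assumptions, with MaxDecimalDigits = 38 and a UTF-8 session, the schema would look like this (the schema name is hypothetical):

DEFINE SCHEMA NET_SCHEMA        /* hypothetical schema name */
(
    NET_ID            VARCHAR(39),   /* DECIMAL(38,0): 38 digits + 1 for the sign    */
    NET_NAME          VARCHAR(900),  /* VARCHAR(300) UNICODE in UTF-8: 300 x 3 bytes */
    NET_KEY           VARCHAR(10),   /* DECIMAL(9,0): 9 digits + 1 for the sign      */
    NET_TRAFFIC_BYTES VARCHAR(20)    /* BIGINT: up to 19 digits + 1 for the sign     */
);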

SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition - response (10) by feinholz


Well, without seeing the entire script, what you have shown me is a Selector operator retrieving 2 DECIMAL columns and a VARCHAR. Therefore, your schema must match.
 

Reg: Teradata Training - response (2) by gopaltera


Hi chowdary,
There is a good Teradata institute in Hyderabad (BISP). I completed the Teradata course at BISP online, taught by Manohar Krishna. He explains things very well, and he teaches both Teradata development and DBA sessions over the web (GoToMeeting). He provides very good material.
I recommend him for learning Teradata.
Contact details: Manohar Krishna <sunmanu9@gmail.com> or 8499999111

Good luck
 

Sybase To Teradata..... - response (2) by gopaltera


Great, we are using TPT (Teradata Parallel Transporter) to export the data to a file.
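
For context, a minimal sketch of such an export-to-file job using the standard TPT operator templates (credentials are assumed to come from a job variables file; all object and file names are illustrative):

DEFINE JOB export_to_file
DESCRIPTION 'Export a table to a delimited flat file'
(
    APPLY TO OPERATOR ($FILE_WRITER ATTRIBUTES (FileName = 'mytable.dat', Format = 'Delimited'))
    SELECT * FROM OPERATOR ($EXPORT ATTRIBUTES (SelectStmt = 'SELECT * FROM mydb.mytable;'));
);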
 
 

Re: Volatile Tables In Stored_Procedure... - forum topic by gopaltera


Hello everyone,
I have come across a problem with volatile tables in a Teradata stored procedure.
Is the ORDER BY clause not allowed with a volatile table?
If so, when I run the stored procedure and need to sort the results, what is the solution, given that ORDER BY is not allowed? (See the sketch below.)
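
For reference, a minimal sketch of the pattern in question (all names are illustrative). Rows in a table have no inherent order, so the ORDER BY belongs on the SELECT that reads the volatile table, not on the CREATE:

CREATE VOLATILE TABLE vt_results AS
( SELECT item_id, sale_amt
  FROM mydb.sales )                 -- hypothetical source table
WITH DATA
ON COMMIT PRESERVE ROWS;

SELECT item_id, sale_amt
FROM vt_results
ORDER BY sale_amt DESC;             -- sort at retrieval time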
 
~gopal

RODBC system error 193 - response (1) by ulrich


Try
http://forums.teradata.com/forum/analytics/connecting-to-teradata-in-r-via-the-teradatar-package
You need to download the JDBC driver and set the correct path.
Ulrich
 
 

Fast load Vs Mload when AMP is down - response (2) by cheeli


Thank you Dieter for your time on this.


Multi-Value compression has increased the table size - response (21) by amit.saxena0782


Hi Dieter,
 
I proposed MVC to my client for Teradata 12 tables. As per the analysis, I got around 800 GB of savings on 2.3 TB of tables, with both table-level and column-level savings. After much investigation into MVC, the client has come up with the concern below:

Concern: Some of these columns are derived from bases that can change, e.g. pricing strategies, cost price changes, tax (VAT). If any of these bases change, the profile of the data in the tables will change, which means that a totally new set of 'ideal' compression values would apply. How often would the compression values be reviewed?

As per my understanding, if the column values are volatile, as derived columns can be, then we do not suggest applying compression; but if the column values are largely duplicated and static, then we apply COMPRESS to save space. On the whole, though, I am still confused: even though the columns are derived, I still got savings of around 30-40% for that table. Can you please advise whether there is a way to apply compression on tables with some or all derived columns, as I can see significant savings?
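
Not Dieter's reply, but a sketch of one possible answer to the review-frequency concern (names and values are illustrative): the compression list on a non-index column can be revised later with ALTER TABLE, so derived columns can be compressed now and re-tuned when the value profile drifts.

-- Re-specify the multi-value compression list on an existing column
-- (not permitted on primary index columns):
ALTER TABLE mydb.sales
    ADD unit_price COMPRESS (0.00, 9.99, 19.99, 49.99);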
 
Regards,
Amit

Teradata Training Material available - response (4) by msubbura


Hi Todd,
Could you please send them to my mail id mals.ece@gmail.com as well?
Thanks,

Location of SQL History in Studio - response (3) by ramesh.d


Mahesh, it looks like you have hit a Java heap out-of-memory error. You could increase the heap space by editing TeradataStudio.ini in C:\Program Files\Teradata\Client\14.00\Teradata Studio
and modifying the lines below:
-Xms1024m
-Xmx1024m
 

Temporal usage classic scenario - forum topic by Qamar.Shahbaz


Hi,
I need to track the history of party addresses and created a temporal table for this:

CREATE MULTISET TABLE Employee_Addr_Hist (
  Name VARCHAR(100),
  City VARCHAR(100),
  VT PERIOD(DATE) NOT NULL AS VALIDTIME
)
PRIMARY INDEX (Name);

INSERT INTO Employee_Addr_Hist (Name, City, VT)
VALUES ('John', 'London', PERIOD(DATE '2011-01-01', UNTIL_CHANGED));

Now today (2013-03-19) I received a new row from the source saying 'John' moved to 'Paris' on '2012-01-01'. If I use the update below, CURRENT_DATE is used to close the previous record and open the new one, which is wrong, as John moved to Paris on 2012-01-01:

UPDATE Employee_Addr_Hist
SET City = 'Paris'
WHERE Name = 'John'
AND END(VT) IS UNTIL_CHANGED;

Can anyone help me with this? I just need to use the source date for closing the old record and opening the new one, using temporal features.
 
 
 


Temporal usage classic scenario - response (1) by KS42982


You can write your SET statement like below:
SET VT = PERIOD(BEGIN(VT), source_date_column)
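
A fuller sketch along those lines (assuming VT is declared AS VALIDTIME; NONSEQUENCED VALIDTIME lets the statement set the period column directly, and the date here is the source date from the post):

-- Close the open row as of the source date...
NONSEQUENCED VALIDTIME
UPDATE Employee_Addr_Hist
SET VT = PERIOD(BEGIN(VT), DATE '2012-01-01')
WHERE Name = 'John'
AND END(VT) IS UNTIL_CHANGED;

-- ...then open a new row starting on that date.
INSERT INTO Employee_Addr_Hist (Name, City, VT)
VALUES ('John', 'Paris', PERIOD(DATE '2012-01-01', UNTIL_CHANGED));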
 

Performance considerations when taking a backup - forum topic by Nishant.Bhardwaj


Hi Experts,
I need your suggestions on the two possible scenarios through which we can take a backup of a huge table in production.

Scenario 1:
CREATE TABLE A_bkp AS A WITH DATA AND STATS;

Scenario 2:
First, create an empty table: CREATE TABLE A_bkp AS A WITH NO DATA.
Second, use a MERGE statement in place of a normal INSERT ... SELECT to copy the data from the main table to the _bkp table, like:
MERGE INTO A_bkp
USING A
...

I had a discussion with one of my peers and he suggested going with scenario 2 rather than scenario 1, as scenario 1 will run into spool issues in production because the table is really huge, and MERGE does not take any spool to process the records, whereas INSERT ... SELECT does.

As I am a bit unsure, I just wanted to check this with the experts.
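
For what it's worth, a fuller sketch of scenario 2 (column names are illustrative; a Teradata MERGE requires the ON clause to equality-match the target's primary index, assumed here to be id):

CREATE TABLE A_bkp AS A WITH NO DATA;

MERGE INTO A_bkp AS tgt
USING A AS src
  ON tgt.id = src.id
WHEN NOT MATCHED THEN
  INSERT (id, col1, col2)
  VALUES (src.id, src.col1, src.col2);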
 
Thanks in advance.
 
Nishant 
 


How to retrieve the Relative Record Number? - response (9) by wicik


Hi there...

I have a problem very similar to the one at the very start of this thread. I'm pretty new to this type of SQL, so please go easy on me :)

Well... I have to convert a numeric field to a date, then count some upload data and group the results. The numeric column looks like 20121121133549, which is presumably YYYYMMDDHHMMSS. The other columns are acces_point_type and data_uplink. My goal is to convert the numeric to a DATE (without the HHMMSS time part), count data_uplink, and group by acces_point_type.

A simple GROUP BY would not be a problem, but the whole idea of converting the data, combining it with the cast date format, and grouping by it is pretty dark magic to me. Any help would be appreciated.
PS: Sorry for my terrible English :/
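
Not the poster's solution, just a sketch of the conversion step. Teradata stores a DATE internally as (year - 1900) * 10000 + month * 100 + day, so an integer in YYYYMMDD form can be converted by subtracting 19000000 and casting; the division simply drops the HHMMSS digits (the table name is hypothetical):

SELECT CAST(CAST(numeric_column / 1000000 AS INTEGER) - 19000000 AS DATE) AS event_date
FROM mydb.mytable;  -- hypothetical table name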
 


Re: remove the CRLF in the fastexport generated file - forum topic by balu_td


Hi,
I need to export a file with certain fields, where one column's data contains a CR. For example:
              cml4
 record1 aaa      bbb       ccc
 record2 ddd      eee       ffff
The spaces between aaa and bbb represent the newline characters. Now I need to remove these and then export the field; the export file should not contain any CR.
Please help!
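
One possible approach, sketched under the assumption that OREPLACE is available (built in from Teradata 14.0; earlier releases need the equivalent UDF). '0D'XC and '0A'XC are hex literals for CR and LF; the table name is illustrative:

SELECT OREPLACE(OREPLACE(cml4, '0D'XC, ''), '0A'XC, '') AS cml4_clean
FROM mydb.mytable;  -- hypothetical table name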


Re: remove the CRLF in the fastexport generated file - response (1) by balu_td

$
0
0

A record may or may not contain a CR, and if it does, I don't know how many CRs there will be.
 

how to set up an alert in Viewpoint or other means, if DBS is reset on its own or manually done using TPARESET. - response (1) by ashikmh


Hello All,
Can anybody please help me with this?
-Thanks

Conversion of numeric to date and group by it. - forum topic by wicik


Hi there...
I'm pretty new to this type of SQL, so please go easy on me :)

Well... I have to convert a numeric field to a date, then count some upload data and group the results. The numeric column looks like 20121121133549, which is presumably YYYYMMDDHHMMSS. The other columns are acces_point_type and data_uplink. My goal is to convert the numeric to a DATE (without the HHMMSS time part), count data_uplink, and group by acces_point_type.

A simple GROUP BY would not be a problem, but the whole idea of converting the data, combining it with the cast date format, and grouping by it is pretty dark magic to me. Any help would be appreciated. (See the sketch below.)
PS: Sorry for my terrible English :/
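
A sketch of the full aggregation, building on the conversion shown in the earlier thread (the table name is hypothetical; column names are from the post; swap SUM for COUNT if a row count is wanted instead of a total):

SELECT acces_point_type,
       CAST(CAST(numeric_column / 1000000 AS INTEGER) - 19000000 AS DATE) AS event_date,
       SUM(data_uplink) AS total_uplink
FROM mydb.mytable   -- hypothetical table name
GROUP BY 1, 2
ORDER BY 1, 2;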
 


How to retrieve the Relative Record Number? - response (10) by wicik


Well...

It sounds silly, but I helped myself with a plain SELECT --> copy-paste to Excel --> sorting, plus a numeric-to-date conversion via the LEFT and RIGHT functions (to split out the needed digits and reassemble them as a date), and then making a PivotChart from all of it.

It gave me pretty much the same result, but not as polished as it should be.
Still, any help would be welcome :)
I need to learn how to do it properly.
 
Regards
