Hi,
I have 140,000 rows in a CSV file that I need to import into a volatile table. This is my code:
Creating table:
CREATE VOLATILE TABLE Employee_Report (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10)) ON COMMIT PRESERVE ROWS;
Importing data into the table:
.IMPORT VARTEXT ',' FILE = C:\Employee.CSV
.REPEAT *
USING F1 (VARCHAR(10)), F2 (VARCHAR(10)), F3 (VARCHAR(10))
INSERT INTO Employee_Report VALUES (:F1, :F2, :F3);
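For reference, here is the full script as one BTEQ session; the logon values below are placeholders, not my real tdpid/credentials. The CREATE and the import have to run in the same session, since a volatile table disappears when its session ends.

.LOGON tdpid/username,password
/* Volatile table is visible only inside this session */
CREATE VOLATILE TABLE Employee_Report
  (F1 VARCHAR(10), F2 VARCHAR(10), F3 VARCHAR(10))
ON COMMIT PRESERVE ROWS;

/* Read comma-delimited records; .REPEAT * applies the next request to every record */
.IMPORT VARTEXT ',' FILE = C:\Employee.CSV
.REPEAT *
USING F1 (VARCHAR(10)), F2 (VARCHAR(10)), F3 (VARCHAR(10))
INSERT INTO Employee_Report VALUES (:F1, :F2, :F3);

.QUIT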
Issue:
There is no response on the BTEQ screen for hours, and I have to close BTEQ forcefully.
Please guide me: can BTEQ handle this volume of data?