Hi.
We purchased the HXTT JDBC drivers for Access, .csv, and .txt files about a year ago. We have developed a program that reads data from Microsoft Access and loads it into Oracle after some manipulation. The problem occurs when we try to read from very large Access files (about 2 GB in size, with 3.5 to 5 million records). The machine has about 1 GB of RAM, and we run the Java program with the maximum heap size set to about 700 MB. After about 1.5 million records have been loaded, we receive java.lang.OutOfMemoryError. I think the JDBC driver loads all the rows into the result set, even though we have opened it with java.sql.ResultSet.TYPE_FORWARD_ONLY and java.sql.ResultSet.CONCUR_READ_ONLY and we do not need to update the result set. We have set the fetch size to 50000 with setFetchSize(), but we still receive the same error. We are in real trouble, as our loading process is halted by these big files. Moreover, we cannot afford to install more memory in our machines just for this loading purpose.
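For reference, here is a simplified sketch of how we open the result set. The database path, table name, and SQL are placeholders, and the driver class name and URL format are written from memory of the HXTT documentation, so they may need adjusting:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AccessReadTest {
    public static void main(String[] args) throws Exception {
        // HXTT Access driver class and URL format (check against your driver manual)
        Class.forName("com.hxtt.sql.access.AccessDriver");
        Connection conn = DriverManager.getConnection("jdbc:access:/data/bigfile.mdb");

        // Forward-only, read-only cursor; we never update the result set
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,
                ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(50000); // hint to fetch 50,000 rows per round trip

        // "big_table" stands in for the Access table we read from
        ResultSet rs = stmt.executeQuery("SELECT * FROM big_table");
        while (rs.next()) {
            // manipulate each row and batch-insert into Oracle (omitted here)
        }

        rs.close();
        stmt.close();
        conn.close();
    }
}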
We are anxiously waiting for your response.
Thanking you in anticipation.
Attiq Shahzad.
Diallog Broadband,
Islamabad, Pakistan.
|
>We purchased the HXTT JDBC drivers for Access, .csv, and .txt files about a year ago.
First, you can download the latest version. It seems that you haven't updated your version in the past 8 months. :)
>After about 1.5 million records have been loaded, we receive java.lang.OutOfMemoryError.
>I think the JDBC driver loads all the rows into the result set.
>We have set the fetch size to 50000 with setFetchSize(), but we still receive the same
> error.
HXTT Access doesn't load all data into memory. What is your SQL?
|