I am having a problem using the DBF library.
I am generating XML from a number of tables (one table at a time). When dealing with a relatively large table (6296 rows) the while(rs.next()) loop works OK until row 3207 and then dies with
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I had initially planned to fix this problem by using LIMIT and OFFSET to chunk the table and process one block at a time, but it seems the DBF JDBC driver does not support that SQL syntax. How can I fix this?
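For context, the export loop in question follows the usual JDBC pattern. This is a minimal sketch, not the actual code: the connection URL, table name, and column name are placeholders, and the XML is accumulated naively in memory.

```java
import java.sql.*;

public class TableToXml {
    // Escape the five XML special characters in a column value.
    static String escapeXml(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")
                .replace("\"", "&quot;").replace("'", "&apos;");
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL; the real DBF directory path differs.
        try (Connection conn = DriverManager.getConnection("jdbc:dbf:/path/to/dbf/dir");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM yourtable")) {
            StringBuilder xml = new StringBuilder("<table>\n");
            while (rs.next()) {   // OutOfMemoryError reported around row 3207
                xml.append("  <row name=\"")
                   .append(escapeXml(rs.getString("name")))
                   .append("\"/>\n");
            }
            xml.append("</table>");
            System.out.println(xml);
        }
    }
}
```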
|
>When dealing with a relatively large table (6296 rows) the
>loop works OK until row 3207 and then dies with
6296 rows is a very small table.
>Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>I had initially planned to fix this problem by using LIMIT and OFFSET
You needn't do that. The likely cause is invalid LOB information in yourtable.FPT (or .DBT) at about row 3207. Please try the latest package first. If it still fails, zip your database sample and upload it to:
ftp site: ftp.hxtt.com
ftp user: anonymous@hxtt.com
ftp password: (empty)
login mode: normal (not anonymous)
ftp port: 21
upload directory: incoming
transfer mode: binary (not ASCII)
After uploading, you won't see the file in the directory listing, but it will have been uploaded.
Then we will check and fix that invalid LOB information for you.
>I had initially planned to fix this problem by using LIMIT and OFFSET to chunk
> the table and process one block at a time,
You can use select * from yourtable where _rowid_ between 1001 and 2000, but I don't think that will fix your issue. In fact, HXTT DBF can run select * from yourtable on tables with millions of rows, because it never loads all rows into memory. You can use Statement.setFetchSize() to change the row cache size.
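The two suggestions above can be sketched as follows. The range arithmetic over _rowid_ is factored into a plain helper; the connection URL, table name, chunk size, and fetch size are all illustrative values, not prescribed ones.

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class RowidChunks {
    // Build "between" queries over the 1-based _rowid_ pseudo-column
    // so the table can be read one chunk at a time.
    static List<String> chunkQueries(String table, int totalRows, int chunkSize) {
        List<String> queries = new ArrayList<>();
        for (int start = 1; start <= totalRows; start += chunkSize) {
            int end = Math.min(start + chunkSize - 1, totalRows);
            queries.add("select * from " + table
                    + " where _rowid_ between " + start + " and " + end);
        }
        return queries;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder URL; the real DBF directory path differs.
        try (Connection conn = DriverManager.getConnection("jdbc:dbf:/path/to/dbf/dir");
             Statement st = conn.createStatement()) {
            st.setFetchSize(500);  // shrink the driver's row cache
            for (String sql : chunkQueries("yourtable", 6296, 1000)) {
                try (ResultSet rs = st.executeQuery(sql)) {
                    while (rs.next()) {
                        // process one chunk of rows here
                    }
                }
            }
        }
    }
}
```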
|