
HXTT DBF
DBF ResultSet
Tom Hebbron
2006-11-20 09:18:12
I am having a problem using the DBF library.

I am generating XML from a number of tables (one table at a time). When dealing with a relatively large table (6296 rows) the while(rs.next()) loop works OK until row 3207 and then dies with

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

I had initially planned to fix this problem by using LIMIT and OFFSET to chunk the table and process one block at a time, but it seems that the DBF JDBC driver does not support this SQL syntax - how can I fix this?
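(For illustration, a minimal sketch of the kind of export loop described above; the driver class, connection URL, and table name are assumptions, not taken from this post.)

import java.sql.*;

// Sketch of a streaming table-to-XML export over JDBC.
// Driver class, URL, and table name are illustrative assumptions.
public class TableToXml {
    public static void main(String[] args) throws Exception {
        Class.forName("com.hxtt.sql.dbf.DBFDriver"); // assumed HXTT DBF driver class
        try (Connection conn = DriverManager.getConnection("jdbc:dbf:/data");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select * from yourtable")) {
            ResultSetMetaData meta = rs.getMetaData();
            System.out.println("<rows>");
            while (rs.next()) {
                StringBuilder row = new StringBuilder("  <row>");
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.append('<').append(meta.getColumnName(i)).append('>')
                       .append(rs.getString(i))
                       .append("</").append(meta.getColumnName(i)).append('>');
                }
                row.append("</row>");
                System.out.println(row);
            }
            System.out.println("</rows>");
        }
    }
}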
Re:DBF ResultSet
HXTT Support
2006-11-20 16:35:16
>When dealing with a relatively large table (6296 rows) the
>loop works OK until row 3207 and then dies with
6296 rows is a very small table.

>Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>I had initially planned to fix this problem by using LIMIT and OFFSET
You needn't do that. The reason is most likely invalid LOB information in yourtable.FPT (DBT) at about row 3207. You can try the latest package. If it still fails, you can zip your database sample and upload it to:
ftp site: ftp.hxtt.com
ftp user: anonymous@hxtt.com
ftp password: (empty)
login mode: normal (not anonymous)
ftp port: 21
upload directory: incoming
transfer mode: binary (not ASCII)
After the upload you won't see the uploaded file listed, but it will have been uploaded.

Then we will check and fix the invalid LOB information for you.

>I had initially planned to fix this problem by using LIMIT and OFFSET to chunk
> the table and process one block at a time,
You can use select * from yourtable where _rowid_ between 1001 and 2000, but I don't think that will fix your issue. In fact, HXTT DBF can run "select * from yourtable" on tables with millions of rows, because it never loads all rows into memory. You can use Statement.setFetchSize() to change the row cache size.
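(For reference, a minimal sketch of both suggestions: a plain full-table scan with Statement.setFetchSize(), and chunked reads with the _rowid_ pseudo-column. The driver class, URL, table name, chunk size, and the assumption that _rowid_ starts at 1 are illustrative, not confirmed in this thread.)

import java.sql.*;

public class ChunkedRead {
    public static void main(String[] args) throws Exception {
        Class.forName("com.hxtt.sql.dbf.DBFDriver"); // assumed HXTT DBF driver class
        try (Connection conn = DriverManager.getConnection("jdbc:dbf:/data")) {

            // Option 1: a single full-table scan. Rows are streamed, so only
            // the row cache (here 500 rows) is held in memory at a time.
            try (Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(500);
                try (ResultSet rs = stmt.executeQuery("select * from yourtable")) {
                    while (rs.next()) {
                        // process one row here
                    }
                }
            }

            // Option 2: chunked reads using the _rowid_ pseudo-column,
            // 1000 rows per query, stopping after the first empty chunk.
            int chunk = 1000;
            for (int start = 1; ; start += chunk) {
                String sql = "select * from yourtable where _rowid_ between "
                        + start + " and " + (start + chunk - 1);
                boolean sawRow = false;
                try (Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(sql)) {
                    while (rs.next()) {
                        sawRow = true;
                        // process one row here
                    }
                }
                if (!sawRow) {
                    break;
                }
            }
        }
    }
}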
