A Whole Heap of Issues

I've retrieved a DataSet which I traverse. I use an external Java program to contact the server.

When I've got around 30 records I get a Java heap out-of-memory exception, so I've either got a memory leak or I'm just plain low on memory.

Granted, I am using a large third-party library and running a dev database and an all-singing, all-dancing IDE on a Celeron, but I have plans to expand. Is there a way of 'chunking' or streaming the data into bite-size portions to be managed by my external program?

OK, I've got a lot going on, but I'm hardly pushing things to the limit here.

So what's the deal, is the data access layer non-scalable?

Are there any anti-patterns that cause bottlenecks or memory leaks?

Is the DataSet a particularly heavyweight object?
Likewise, are there patterns to chunk or stream data to make it more scalable?
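
For example, paging through the dev database in fixed-size chunks is roughly what I'm imagining. This is just a plain JDBC sketch of the general idea, not the DataSet API itself; the connection details, table, and column names are made up:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChunkedReader {

    private static final int CHUNK_SIZE = 50; // records per chunk

    public static void main(String[] args) throws Exception {
        // Connection details are made up for illustration.
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/dev_db", "user", "pass");
        try {
            int offset = 0;
            int rowsRead;
            do {
                rowsRead = readChunk(conn, offset);
                offset += rowsRead;
            } while (rowsRead == CHUNK_SIZE); // a short chunk means we're done
        } finally {
            conn.close(); // always release the connection
        }
    }

    // Reads one bite-size chunk and returns how many rows it contained.
    private static int readChunk(Connection conn, int offset) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT id, value FROM scraped_data ORDER BY id LIMIT ? OFFSET ?");
        try {
            ps.setInt(1, CHUNK_SIZE);
            ps.setInt(2, offset);
            ResultSet rs = ps.executeQuery();
            int rows = 0;
            while (rs.next()) {
                // Hand each record off to the external program here.
                System.out.println(rs.getInt("id") + ": " + rs.getString("value"));
                rows++;
            }
            return rows;
        } finally {
            ps.close(); // closing the statement also closes its ResultSet
        }
    }
}

That way only one chunk ever sits in memory at a time, instead of the whole result set.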

Some quick additional points:

In the tutorials, shouldn't the closing of a session be in a finally block?

Regards

Greg

Re: A Whole Heap of Issues

Hi Greg,

Regarding the memory issues, we actually have two FAQs:

http://www.screen-scraper.com/support/faq/faq.php#Optimizing
http://www.screen-scraper.com/support/faq/faq.php#Database

as well as a tutorial

http://www.screen-scraper.com/support/tutorials/tutorial5/tutorial_overview.php

that you'd probably find helpful.

You're right that closing a session would probably best be done in a finally block, just in case an exception is thrown.
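
As a rough illustration in plain Java (the Session type below is just a stand-in for whatever session object you're actually using, so the real class and method names will differ):

public class SessionExample {

    // Stand-in for whatever session/connection type is actually in use.
    interface Session {
        void doWork() throws Exception;
        void close();
    }

    static void runSafely(Session session) throws Exception {
        try {
            session.doWork();
        } finally {
            // Runs whether or not doWork() threw, so the session
            // is always released and can't be left dangling.
            session.close();
        }
    }
}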

Kind regards,

Todd