Hi
I have some code below that reads data from a file of about 17000 records. It takes approximately 5 seconds to load. Can this be made faster? Is that possible? See the code below.
-----------start of code -------------------------
var
  ridx, actualrecords: Integer;
  F1, F2, F3, F4, F5: TField;
begin
  { Populate the search grid }
  LoadProgressBar.Position := 0;
  LoadProgressBar.Visible := True;
  LoadProgressBar.Min := 0;
  LoadProgressBar.Max := datafrm.tbProduct.RecordCount;
  SGSearch.FixedRows := 1;
  NullStrictConvert := False;
  // set the number of rows in the string grid
  ProductEditFrm.SGSearch.RowCount := datafrm.tbProduct.RecordCount;
  for ridx := 1 to ProductEditFrm.SGSearch.RowCount - 1 do
    ProductEditFrm.SGSearch.Rows[ridx].Clear;
  ProductEditFrm.SGSearch.ColWidths[0] := 60;
  ProductEditFrm.SGSearch.ColWidths[1] := 170;
  ProductEditFrm.SGSearch.ColWidths[2] := 85;
  ProductEditFrm.SGSearch.ColWidths[3] := 0;
  ProductEditFrm.SGSearch.ColWidths[4] := 0;
  // look up the fields once, outside the loop
  datafrm.tbProduct.First;
  F1 := datafrm.tbProduct.FieldByName('PRODUCT_SELLING_CODE');
  F2 := datafrm.tbProduct.FieldByName('PRODUCT_DESCRIPTION_ONE');
  F3 := datafrm.tbProduct.FieldByName('PRODUCT_SELLING_BARCODE');
  F4 := datafrm.tbProduct.FieldByName('PRODUCT_CATEGORY_ID');
  F5 := datafrm.tbProduct.FieldByName('PRODUCT_SUPPLIER_NUMBER_SELLING');
  // copy the field values into the string grid
  ridx := 1;
  actualrecords := 0; // must be initialized before use in the loop
  datafrm.tbProduct.DisableControls;
  ProductEditFrm.SGSearch.BeginUpdate;
  try
    while not datafrm.tbProduct.Eof do
    begin
      Inc(actualrecords);
      SGSearch.Cells[0, ridx] := F1.Value;
      SGSearch.Cells[1, ridx] := F2.Value;
      SGSearch.Cells[2, ridx] := F3.Value;
      SGSearch.Cells[3, ridx] := F4.Value;
      SGSearch.Cells[4, ridx] := F5.Value;
      Inc(ridx);
      datafrm.tbProduct.Next;
      LoadProgressBar.Position := actualrecords;
    end;
  finally
    ProductEditFrm.SGSearch.EndUpdate;
    datafrm.tbProduct.EnableControls; // re-enable even if an exception occurs
  end;
  SearchData.SetFocus;
end;
---------------end of code------------------------
Thank you
It is important to separate dataset performance from grid performance.
So, did you measure the effect of your dataset?
Hello Bruno
You are right, there are some database considerations in iterating through the file, and that is probably
costing some time, but 17000 records is not a lot of data (Win 7 32-bit, 3 GHz CPU).
So if the database access that fills the string grid is slowing it down, perhaps I could use another approach.
What if the database is exported to a text file, e.g. a CSV file?
Is filling a string grid from a text source going to be any faster?
Thanks
Kamran
As I demonstrated with the code, loading 17000 rows into the grid takes approx. 100 ms.
The question with respect to your project is how fast you can iterate through your dataset. I'd suggest first measuring this separately and checking whether iterating through the dataset can be optimized.
Also, did you measure the time of the iteration loop itself versus the time needed before the iteration starts?
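A minimal sketch of such a separate measurement, assuming a Delphi version that provides TStopwatch in System.Diagnostics: step through the dataset without touching the grid at all, and compare the elapsed time against the full load. (MeasureDatasetIteration is a hypothetical helper name.)

  uses
    System.Diagnostics; // TStopwatch

  procedure MeasureDatasetIteration;
  var
    sw: TStopwatch;
    rows: Integer;
  begin
    sw := TStopwatch.StartNew;
    datafrm.tbProduct.DisableControls;
    try
      datafrm.tbProduct.First;
      rows := 0;
      // iterate only; no grid access, so this isolates the dataset cost
      while not datafrm.tbProduct.Eof do
      begin
        Inc(rows);
        datafrm.tbProduct.Next;
      end;
    finally
      datafrm.tbProduct.EnableControls;
    end;
    sw.Stop;
    ShowMessage(Format('%d records iterated in %d ms',
      [rows, sw.ElapsedMilliseconds]));
  end;

If this loop alone already takes seconds, the dataset, not the grid, is the bottleneck.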
Finally, I see in your code something like:
Hi Bruno
1. I just changed my code to grid.Clear; it does not change the time, it still takes 5 seconds.
2. I removed the LoadProgressBar.
Good news:
It now loads in approx. 2-3 seconds, so we have some improvement.
I will no longer use the TAdvProgressBar in my code, because loading speed is more important.
3. The assignment of data from the database fields to the string grid:
I do not know of any faster way.
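(Editor's note: a sketch of a possible middle ground, if a progress bar is still wanted: update it only every 100 records instead of on every iteration, so the repaint cost is paid 170 times rather than 17000 times. This reuses the actualrecords counter from the code above; the threshold of 100 is an arbitrary choice.)

  // inside the loop, replacing the per-record assignment:
  if actualrecords mod 100 = 0 then
    LoadProgressBar.Position := actualrecords;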
thanks
Kamran
1) Are you using the latest version of the components? There was a performance issue in an older version of TAdvProgressBar, but we are not aware of one in the latest version.
2) Again, as the code shows, without a dataset, just assigning 17000 x 5 cells takes only 100 ms, so this can't be the reason for the 2-3 second load time you see.
3) If you fill all 17000 rows anyway, I'm not sure why you need to call grid.Clear; filling all the cells overwrites the cell data anyway.
1) Using version 1.2.0.0 of TAdvProgressBar, as shown in the properties information.
2) I will remove the grid.Clear (I just thought it was a better idea to clear it first).
thank you for your help
kamran
Hi Bruno
I just loaded the data using SGSearch.LoadFromCSV('c:\pcs\productdata.csv',17000);
and the data loads almost instantly from the .csv file (approx. 1 second), so I think I must go for the .csv solution until I can find a faster way to load from the database file. The user is most demanding and wants the information instantly; he does not like waiting .. just the usual demands.
Kamran
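(Editor's note: for completeness, a sketch of regenerating that CSV from the product table whenever the data changes, assuming the same five fields and file path as in the code above. This naive version does not quote values that contain commas, so it is only safe if the field values are comma-free.)

  var
    csv: TStringList;
    F1, F2, F3, F4, F5: TField;
  begin
    F1 := datafrm.tbProduct.FieldByName('PRODUCT_SELLING_CODE');
    F2 := datafrm.tbProduct.FieldByName('PRODUCT_DESCRIPTION_ONE');
    F3 := datafrm.tbProduct.FieldByName('PRODUCT_SELLING_BARCODE');
    F4 := datafrm.tbProduct.FieldByName('PRODUCT_CATEGORY_ID');
    F5 := datafrm.tbProduct.FieldByName('PRODUCT_SUPPLIER_NUMBER_SELLING');
    csv := TStringList.Create;
    try
      datafrm.tbProduct.DisableControls;
      try
        datafrm.tbProduct.First;
        while not datafrm.tbProduct.Eof do
        begin
          csv.Add(Format('%s,%s,%s,%s,%s',
            [F1.AsString, F2.AsString, F3.AsString, F4.AsString, F5.AsString]));
          datafrm.tbProduct.Next;
        end;
      finally
        datafrm.tbProduct.EnableControls;
      end;
      csv.SaveToFile('c:\pcs\productdata.csv');
    finally
      csv.Free;
    end;
  end;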