I have a Data Table that can be fairly large (it is created dynamically based on user-defined filters), i.e. several hundred thousand rows. I would like to download that Data Table as Excel via the API.
Sometimes I get a warning like this:
"The downloaded file contains only 90909 of 319117 rows because the Data Table is too large (max. 1000000 cells)."
Is this limit of 1 million cells a hard limit or just a safety precaution?
The value can be increased via the backend (OD Core) property dataset.export.cellCountMax.
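
For illustration, this is roughly how raising the limit might look in a properties-style backend configuration. The exact file and deployment mechanism depend on your OD Core setup, so treat this as a sketch rather than an official reference; only the property name and the 1,000,000-cell default come from the message above.

```
# Raise the export limit from the default of 1,000,000 cells (example value, assumed)
dataset.export.cellCountMax=2000000
```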
There are, however, physical limitations:
- The generated XLSX sheet must not exceed the format's physical limits for rows and columns (1,048,576 rows and 16,384 columns per sheet; these are hard limits of the standard, afaik). 319117 rows should be way in the comfort zone, though.
- The XLSX is generated via a library that has to operate with all data completely in memory, so RAM limitations might apply. Since the data is kept in at least two redundant representations during generation (and the library is not the most modest one when it comes to RAM consumption), this limit is a rather real obstacle. But you can give it a try by increasing the value in steps and monitoring the resources closely; a rough back-of-the-envelope estimate follows below.
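
To get a feeling for the order of magnitude before experimenting, here is a minimal, hypothetical back-of-the-envelope sketch in Python. The per-cell overhead and the redundancy factor are assumptions, not measured values from OD Core or its XLSX library; plug in what you actually observe while monitoring the backend.

```python
def estimated_peak_ram_mb(rows: int, cols: int,
                          bytes_per_cell: int = 100,    # assumed in-memory cost per cell (strings, object headers, ...)
                          redundancy_factor: int = 2):  # data held in at least two representations during generation
    """Rough estimate of peak RAM (in MB) needed to build the XLSX in memory."""
    cells = rows * cols
    return cells * bytes_per_cell * redundancy_factor / (1024 ** 2)

# Example: the table from the warning above, assuming ~3 columns (319117 rows ~= 1M cells)
print(f"{estimated_peak_ram_mb(319117, 3):.0f} MB")  # ~183 MB under these assumptions
```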