Out of memory #106
Comments
I've tested the "long string" vs the "array" version of the XML cleanup code; there is roughly 3x memory overhead in the array version. |
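For context, a minimal benchmark sketch of that comparison might look like the following. This is not the library's actual cleanup code; the file path and the regex are placeholders, and each variant should be run in its own process since memory_get_peak_usage() never decreases within a request.

```php
<?php
// Placeholder path: a large, single-line worksheet XML extracted from an .xlsx
$xml = file_get_contents('xl/worksheets/sheet1.xml');

function cleanAsString(string $xml): string {
    // "long string" variant: one preg_replace over the whole document
    return preg_replace('/\s+(?=<)/', '', $xml);
}

function cleanAsArray(string $xml): string {
    // "array" variant: split into rows, clean each piece, re-join.
    // The array of substrings plus the rebuilt string is where the ~3x overhead comes from.
    $rows = explode('</row>', $xml);
    foreach ($rows as $i => $row) {
        $rows[$i] = preg_replace('/\s+(?=<)/', '', $row);
    }
    return implode('</row>', $rows);
}

$clean = cleanAsString($xml); // or: cleanAsArray($xml), in a separate run
echo 'peak memory: ', memory_get_peak_usage(true), " bytes\n";
```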
For what I needed to do, I ended up asking the client to use a CSV file instead of XLSX, which solves my immediate problem, so I don't have to fix this issue. I don't know if you want to debug it further; we can close this issue if you wish. |
same problem here: |
Check your memory_limit; it may not actually be 2 GB there. |
@shuchkin it is correct, memory_limit in phpinfo() shows 2G |
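One way to rule out a configuration mismatch is to print the effective limit from inside the failing request itself, since the CLI and the web SAPI typically load different php.ini files (and .user.ini or .htaccess can override it per directory). A quick hedged check:

```php
<?php
// Print the limit the current SAPI actually applies, not what another php.ini says.
echo 'memory_limit : ', ini_get('memory_limit'), PHP_EOL;
echo 'loaded ini   : ', php_ini_loaded_file(), PHP_EOL;
echo 'current usage: ', memory_get_usage(true), ' bytes', PHP_EOL;
echo 'peak usage   : ', memory_get_peak_usage(true), ' bytes', PHP_EOL;
```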
Okay, after more testing I've found that it crashes at 39000 rows; when I cut 5000 rows, it works fine with the remaining 34000. |
I think the garbage collector is not triggered after preg_replace. Try this version: https://github.com/shuchkin/simplexlsx/blob/master/src/SimpleXLSX.php |
Nope, still the same errors. |
@shuchkin any ideas? |
try 1.0.12 |
Apologies for chiming in late, but I'm experiencing the same issue and have been for a while now. From what I've researched, the problem is how Excel determines the final row of the spreadsheet: Excel seems to add additional blank rows without logic or reason. If you open a spreadsheet that fails and select the whole worksheet (CTRL+A or CMD+A), you'll notice the selection continues way past any data; watch the scroll bar as you scroll. If you open the file in BBEdit or similar, you can see thousands of empty rows in the content.

When you open the spreadsheet for processing, the code throws a fatal error at line 608, in getEntryData: $entry['data'] = gzinflate($entry['data']);. Using the attached spreadsheet as an example, strlen($entry['data']) hits 18174506, so when you try to inflate the content the server runs out of memory (128 MB). FYI, the spreadsheet is clearly not 18.2 MB worth of actual data.

The workaround I've used, which works 100% of the time, is to manually highlight the data cells and copy them into a brand new spreadsheet. The only way I can think of fixing this in code is to pre-parse the spreadsheet, reading a binary stream of bytes and removing the empty rows before loading the file into memory...? See: https://stackoverflow.com/questions/11265914/how-can-i-extract-or-uncompress-gzip-file-using-php |
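One hedged way to approach that pre-parse idea: an .xlsx is a ZIP archive, so the uncompressed size of every entry is already recorded in the central directory and can be inspected with ZipArchive before anything is inflated. The file name and the 64 MB cap below are illustrative only, not part of SimpleXLSX.

```php
<?php
$zip = new ZipArchive();
if ($zip->open('report.xlsx') === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $stat = $zip->statIndex($i); // includes 'name', 'size' (uncompressed), 'comp_size'
        if ($stat['size'] > 64 * 1024 * 1024) {
            // Bail out before an 18 MB+ sheet blows past memory_limit on inflate.
            error_log("{$stat['name']} would inflate to {$stat['size']} bytes, skipping");
            continue;
        }
        $data = $zip->getFromIndex($i); // safe to read this entry
        // ... hand $data to the parser ...
    }
    $zip->close();
}
```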
Try commenting out this line, but the results can be unpredictable. |
Of course, I already use readRows(), because the files can be very large (up to ~40 MB) - all the results are with it :/ |
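For reference, a minimal sketch of that streaming usage (the file name is a placeholder, and the exact class name/namespace depends on the library version):

```php
<?php
require 'src/SimpleXLSX.php';

if ($xlsx = SimpleXLSX::parse('big-export.xlsx')) {
    // readRows() yields one row at a time instead of building the full sheet array in memory.
    foreach ($xlsx->readRows() as $row) {
        // process a single row here, e.g. append it to a CSV file
    }
} else {
    echo SimpleXLSX::parseError();
}
```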
Very weird error, happening only in the browser, not in the CLI.
Using v0.8.23.
I checked the PHP memory_limit; it is 128M for both PHP CLI and the browser.
The computer has 16 GB of RAM, about 25% of which is used.
I downloaded the xlsx, ran it on my computer, and it worked even in the browser. The xlsx file is 2.5 MB, has about 45000 rows, and only one sheet.
As far as I can understand, this is the line where it happens:
I printed the variable to a file before it crashes, and it generated a 17 MB file, all on one line, so the preg_replace above is trying to operate on that large string and crashes.
I don't know why it works on my computer but not on the other one, or why on the other machine it works in the CLI but not in the browser.
Unfortunately the xlsx file contains sensitive info, so I can't share it, and right now I don't have a reproducible file to share.
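When a fatal "Allowed memory size exhausted" error only appears in one SAPI, a shutdown hook can at least log the effective limit and peak usage at the moment of death. A small hedged sketch, to be dropped in before the parser runs:

```php
<?php
register_shutdown_function(static function () {
    $err = error_get_last();
    if ($err !== null && $err['type'] === E_ERROR) {
        // Log the fatal message together with the limit this SAPI actually enforced.
        error_log(sprintf(
            'fatal: %s | memory_limit=%s | peak=%d bytes',
            $err['message'],
            ini_get('memory_limit'),
            memory_get_peak_usage(true)
        ));
    }
});
```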