11/20/17, 11:58 AM   #9
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
Originally Posted by ZOS_ChipHilseberg View Post
There is a limit on the number of constants (unique strings in this case) which comes from using 32-bit bytecodes. We could switch to 64-bit, but there is a memory and performance cost to doing that (for all addons and our UI).
Could you explain this a little more? I don't see how this would explain an array of 131072 "true" values becoming corrupted on load, or why it doesn't get corrupted on save or while in memory, but only when loading the file. I also can't replicate it using the raw Lua C API, in either 5.1 or 5.2: both load the files fine that get corrupted in ESO.

Is there any way that corruption and loss of data can be prevented when loading files with "too many" entries? I'd rather that entries past the "too many" point simply be ignored than have them corrupt the entire file.

As for Ayantir's question on why: so long as it isn't a huge amount of time/effort to at least prevent data corruption, I think it is worth it, even if only a fraction of users ever encounter it. It's also worth pointing out that while it only affects some users *now*, depending on the nature of the bug it may well start affecting more and more users at some point. I run into data corruption all the time, as I deal with very large amounts of logged data from the game, but my case is definitely unique. I also don't believe it is as simple as limiting arrays to <50k elements, as I've encountered data corruption even with smaller arrays.
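For anyone else hitting this in the meantime, one mitigation worth trying is splitting large arrays into smaller sub-tables before they get written to SavedVariables, and rejoining them on load. This is just a sketch, not a confirmed fix (the chunk size here is an assumption, and as I said above, I've seen corruption even with smaller arrays, so it may only reduce the odds):

```lua
-- Sketch: split a large flat array into fixed-size chunks before save,
-- so the serialized SavedVariables file contains several smaller table
-- constructors instead of one huge one. CHUNK_SIZE is a guess, chosen
-- well below the sizes where I've seen corruption.
local CHUNK_SIZE = 32768

local function chunkArray(data)
    local chunks = {}
    for i = 1, #data, CHUNK_SIZE do
        local chunk = {}
        for j = i, math.min(i + CHUNK_SIZE - 1, #data) do
            chunk[#chunk + 1] = data[j]
        end
        chunks[#chunks + 1] = chunk
    end
    return chunks
end

-- Rebuild the original flat array from the saved chunks on load.
local function joinChunks(chunks)
    local data = {}
    for _, chunk in ipairs(chunks) do
        for _, v in ipairs(chunk) do
            data[#data + 1] = v
        end
    end
    return data
end
```

You'd call chunkArray on the big table right before the UI saves variables and joinChunks in your OnAddOnLoaded handler. Again, this only changes the shape of the saved file, so if the limit is on total unique constants rather than per-table size, it won't help.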