11/19/17, 03:39 PM | #1
[implemented] Issues when saving/opening a lot of data to/from saved variables
Hi,
I wonder what the true and exact limits for saving data into saved variables are. I played around a bit with a simple setup where I directly accessed the global variable defined in the manifest file, and I managed to create a 2.7 MB saved variables file that gets corrupted upon loading simply by filling it with a large flat table (a rough sketch of the test follows below).

If there are more than 131072 entries, the last and first entries become weird: an entry uses its own handle and value as its table key. This usually happens on load; when I check the file before loading, I can see in a text editor that everything is still fine. If I save longer values, the maximum number of entries decreases to about 99.5k, and with even longer values to about 52k.

I would like to somehow make sure that I don't create corrupted saved variables (but still be able to save a lot of data). It somehow feels like a bug to me, but it might be something that cannot be changed. So if the restriction cannot be lifted, I would at least like to have a way to tell whether my file will be fine or not.
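The code snippets from the original post did not carry over, so the following is only a rough sketch of the kind of test described, assuming the manifest declares "## SavedVariables: MyTestSavedVars" (the variable and function names are made up):
Lua Code:
MyTestSavedVars = MyTestSavedVars or {}

-- write `count` entries straight into the saved-variables global declared in the manifest
local function FillTestData(count, makeValue)
    MyTestSavedVars.test = {}
    for i = 1, count do
        if makeValue then
            MyTestSavedVars.test[i] = makeValue(i)   -- e.g. a distinct string per entry
        else
            MyTestSavedVars.test[i] = true
        end
    end
end

-- counts above roughly 131072 are where the corruption described above showed up;
-- longer, distinct string values lowered the threshold further (~99.5k, then ~52k in the post)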
11/19/17, 06:08 PM | #2
Through personal experience and that of other devs: even 10k table entries can have unexpected results.
11/19/17, 06:58 PM | #3
I've been testing this too and while I've found results similar to yours I'm left more confused now about exactly what the limit is and how it changes depending on the content. I've made an empty addon that just populates the saved variable file with a structure like:
Code:
TestNoneLogSavedVars =
{
    ["Default"] =
    {
        ["@Reorx"] =
        {
            ["$AccountWide"] =
            {
                ["test1"] = {},
                ["version"] = 3,
            },
        },
    },
}
So, more data, but I'm not sure I'm any closer to understanding the exact issue or exactly when it occurs. I know in our uespLog addon I've limited the logging data array to 60k elements to prevent issues, although I do run into random corruption during various data mining operations involving large amounts of data.

It sort of looks like someone is using a fixed buffer with a size around 131072 elements which is somehow overflowed during file loading. I assume the ESO code somewhere is just using a dofile()/loadfile()/dostring() Lua API call, and if so the issue would be in the Lua library somewhere. Or perhaps the ESO code uses a custom file loader which has a bug in it. It would be nice if a ZOS dev (Chip?) could step through the saved variable loading with a known bad file to see where the issue is, confirm it, and hopefully fix it.

For the record, all you need to do is create an addon with saved variables that outputs a single large array like:
Code:
function testNone.SetVar1(count)
    if (count == nil) then count = 131072 end
    testNone.savedVars.test1 = {}
    for i = 1, count do
        testNone.savedVars.test1[i] = true
    end
end
11/20/17, 06:58 AM | #4
I might have found a clue.
Shinni pointed out that Lua stores each string only once, so a duplicate string is never stored again. Maybe it's the same with other values. That would make it very memory efficient and might explain how certain limits come into place. It would probably mean that there is a fixed number of different keys + values (my guess: 131072) a saved table can have, whether they are strings or numbers.

For example, take a function that, given N and N2, builds a table with the numbers 1 to N2 as keys and N strings of the form "i|k" under each key (a sketch follows below). Calling it with N = 131 and N2 = 1000 leads to a corrupted file, since the numbers 1 to 1000 are used as keys and 131000 strings of the form "i|k" are created. That means 132000 different elements are in use, causing it to fail when the table is recreated from the file and to start addressing them from the beginning again. Calling the same function with N = 130 and N2 = 1000 seems to be fine (that would be 131000 different keys + values).

Edit: Just saw that uesp reached the same conclusion at the end. I didn't read it in detail as I already had an idea in mind that I wanted to note down.
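The snippet itself didn't survive the export, but based on the description it must have looked roughly like this (table and function names are placeholders):
Lua Code:
MyTestSavedVars = MyTestSavedVars or {}

-- build N2 sub-tables keyed 1..N2, each holding N strings of the form "i|k"
local function FillCombinedTest(N, N2)
    MyTestSavedVars.test = {}
    for i = 1, N2 do
        MyTestSavedVars.test[i] = {}                     -- the numbers 1..N2 become key constants
        for k = 1, N do
            MyTestSavedVars.test[i][k] = i .. "|" .. k   -- N * N2 distinct string constants
        end
    end
end

-- FillCombinedTest(131, 1000) -> ~132000 distinct keys + values, corrupted on reload
-- FillCombinedTest(130, 1000) -> ~131000, still loads fine according to the post above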
11/20/17, 10:15 AM | #5
There is a limit on the number of constants (unique strings in this case) which comes from using 32-bit bytecodes. We could switch to 64-bit, but there is a memory and performance cost to doing that (for all addons and our UI).
11/20/17, 10:40 AM | #6
Before changing this, I would first ask why. ESO is now 3 years old and we have managed to live without it, so maybe we could try together to bypass that limit by working differently.
If it's for the bull**** of https://forums.elderscrollsonline.co...le-size-limits, I would simply add: LEARN TO CODE. If you have an example of an addon storing more than 130k keys, please show us your addon, how you save data, etc. And I can already say that saving an entire Combat Metrics log for each combat event in saved vars is not a good idea; there is simply too much data. I had pChat tables of 90k lines and lorebooks tables of 95k datamined entries and I managed to make it work. Not alone, not without pain, not quickly, but it has been done.
11/20/17, 11:02 AM | #7
@Chip How much of a performance impact are we talking about? Would it be noticeable?
@Ayantir Even if it is an old topic, I have to agree that corrupting saved data on load is a bug and should be fixed. Maybe the game should stop writing data before it reaches the limit, or skip saving a new version of a file if it has too many strings? Losing the data of one session is IMO still better than losing all data. The ZO_SavedVariables class could also offer some way for addons to determine how much "space" is still left, giving them a way to decide whether they should remove old entries in whatever way they need.
11/20/17, 11:06 AM | #8
I can understand saving massive quantities of data to the saved vars while doing dev stuff, but for actual users to be doing so? Loading all of that data/writing to it would be a significant performance drain. Every time I go on the PTS, it's amazing how fast this game loads when you don't have addons. I don't even have that many addons, but I do have a few big ones. More addons doing things with even larger data sets just seems like a bad idea, especially when the layman won't know (beforehand) what a massive hit they are going to take when using such an addon.
11/20/17, 11:58 AM | #9
Is there any way that corruption and loss of data can be prevented when loading files with "too many" entries? I'd rather have entries past the "too many" point simply be ignored than have the entire file corrupted.
As for Ayantir's question on why: I think that as long as it is not a huge amount of time/effort to at least prevent data corruption, it is worth it, even if only a fraction of users ever encounter it. It is also worth pointing out that *now* it only affects some number of users, but depending on the nature of the bug it may well start affecting more and more users at some point. I run into data corruption all the time as I deal with very large amounts of logged data from the game, but my case is definitely unique. I also don't believe it is as simple as limiting arrays to <50k elements, as I've encountered data corruption even with smaller arrays.
11/20/17, 12:18 PM | #10
A couple of references I've found, in case anyone else happens to be interested in the technical details of this issue:
From the first link the limit seems to be 2^18 literal constants per source function (i.e., a single saved variable file). So it's not nearly as simple as just limiting array sizes to some arbitrary value, and this would explain why I've run into data corruption issues with relatively small arrays of deeply nested data.
Edit: This also explains why the issue is at load time and not at save or run time. At run time a table can have as many elements as memory allows. Saving is fine, as you are just serializing the data to a string format. At load time, however, you are converting the string table format to VM byte-code, which is where the 2^18 issue lies. I'm surprised this sort of thing doesn't result in a Lua run-time error, as the library is pretty good about detecting these.
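For comparison, here is a small stand-alone experiment in plain Lua 5.1 outside the game (so only an approximation of ESO's modified interpreter) showing that holding this many unique values at run time is fine, while compiling equivalent source text is where the constant limit bites:
Lua Code:
-- generate source text resembling a saved-variables file with ~300k unique strings;
-- note that `parts` itself already holds those strings at run time without any problem
local parts = { "return {" }
for i = 1, 300000 do
    parts[#parts + 1] = string.format("%q,", "value" .. i)
end
parts[#parts + 1] = "}"
local source = table.concat(parts)

-- compiling the same data as source is what hits the constant table limit
local chunk, err = loadstring(source)
print(chunk, err)  -- stock Lua 5.1 reports an error such as "constant table overflow"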
11/20/17, 01:29 PM | #11
So with some more digging I have some answers but also more questions. I'm not entirely sure the issue is due to the overflow of the constant table, but it may be related to it somehow.
The following assumes a minimal saved variable file with N entries of true, as used previously.
So there are two probably related issues here: a crash while loading, and the data corruption itself.
Hopefully the crash can be avoided by adding some error checking, and the cause of the data corruption still needs to be confirmed.
11/20/17, 07:21 PM | #12
If you use ZO_SavedVariables, a few entries are used up for keys ("Default", "@Reorx", ... in your example). Then you have true as a value, and the rest goes into the numbered keys of your table (1 to 131065); that way you fill up the number of constants to 2^17.
Also, thanks for all the replies! If there is a noticeable performance cost, I wouldn't want the 64-bit bytecode system either. For me it is enough to know how to prevent data loss in the future, since it's not hard to go through the table and find out whether the limit is reached. Also, knowing how the problem occurs tells me how I can improve my data format in a way that tries to minimize the use of disk space but also limits the number of unique values and keys. I'll probably discuss a few ideas in the chat over the next days.

On the question of why I want to store this much data: having a full combat log is useful for theorycrafting or for finding bugs in the combat system. Usually I cannot stop during a raid, so looking and filtering through the log at a later point gives me valuable insights into what is happening. Having an option to analyze it after the raid is important for me. Of course the general user doesn't need that, which is why by default the log won't be saved; it requires Shift+clicking the save button. I'll probably work on a way to only save selected entries (e.g. only damage events) to improve the situation. Having played around, I also got a good feel for the increase in loading time when a 130 MB file is loaded. Because of this I'll also add a limiter that interacts with the user once a certain size of the saved variables is reached.
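As an illustration of that last point, one way such a guard might look (purely a sketch; the names and threshold are made up):
Lua Code:
-- stop appending once the log holds roughly 100k entries, staying well below
-- the ~131k constant threshold observed earlier in this thread
local MAX_LOG_ENTRIES = 100000

local function TryAppendLogEntry(log, entry)
    if #log >= MAX_LOG_ENTRIES then
        return false  -- let the caller warn the user or ask before discarding data
    end
    log[#log + 1] = entry
    return true
end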
11/20/17, 07:54 PM | #13
It's also non-obvious just by looking at the file sizes. For example, I have a 140 MB file that only uses 50k constants, while a 12 MB file uses 120k constants. What matters is how many unique constants you have rather than the actual file size.

It should be possible to count the number of constants in a saved variable from within the Lua API in ESO: just find the root of the saved variable, iterate through it, save all strings/numbers/true/false/nil into a table as keys, and then count the number of keys in that table (see the sketch below). At the very least it would let you know if you are getting close to the problem size.

I also don't have any idea why the problem occurs at 2^17 instead of 2^18 like it should according to the technical details. Offhand I'd guess a signed/unsigned issue, but I don't see any such issue in the code itself, and the Lua API code works fine up to 2^18, indicating some issue relating to ESO or the version of Lua it uses.
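A rough sketch of that counting approach (the helper name is made up; it walks the saved-variables table and counts distinct keys and values, which approximates the number of constants the file will need):
Lua Code:
local function CountUniqueConstants(root)
    local seen = {}     -- every distinct key/value met so far
    local count = 0
    local function visit(tbl)
        for k, v in pairs(tbl) do
            if seen[k] == nil then
                seen[k] = true
                count = count + 1
            end
            if type(v) == "table" then
                visit(v)                 -- saved variables are trees, so no cycle check needed
            elseif seen[v] == nil then
                seen[v] = true
                count = count + 1
            end
        end
    end
    visit(root)
    return count
end

-- e.g. d(CountUniqueConstants(TestNoneLogSavedVars)) in ESO, or print(...) in plain Lua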
11/21/17, 05:12 PM | #14
We'd have to do profiling to see what the impact is. We haven't explored it yet.
01/17/18, 10:38 PM | #15
So, with the upcoming changes, namely no longer supporting the 32-bit client, will that have any impact on this situation?
04/02/18, 09:34 AM | #16
With Summerset we've changed to using 64-bit bytecodes, which should fix this.
03/14/19, 05:45 PM | #17
OK, this is probably overdue, but since this is implemented now, the game can easily load more than 10 million constants (although loading times then increase to very noticeable amounts, so don't do it). This is "fixed", or rather improved. This thread can be closed.