11/20/17, 07:21 PM   #12
Solinur
AddOn Author
Join Date: Aug 2014
Posts: 78
Originally Posted by Uesp
Eso (64 bit if it matters) repeatably corrupts data after 131065 entries.
If you use ZO_SavedVariables, a few entries are already used up for keys ("Default", "@Reorx" ... in your example) and for true as a value. The rest goes into the numbered keys of your table (1 to 131065), which fills the number of constants up to 2^17.

Also, thanks for all the replies!

If there is a noticeable performance cost, I wouldn't want the 64-bit bytecode system either.

For me it is enough to know how to prevent data loss in the future, since it's not hard to go through the table and find out whether the limit is being reached.
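Something roughly like the following would do for that check. It's just a sketch, assuming it really is the unique string and number keys/values that end up as constants (as described above); the saved variables name in the usage comment is a placeholder.

Code:
-- Sketch: estimate how many unique string/number keys and values a saved
-- variables table holds, since those are what seem to fill up the constant limit.
local function CountUniqueConstants(tbl, seen)
    seen = seen or {}
    local count = 0
    for key, value in pairs(tbl) do
        if (type(key) == "string" or type(key) == "number") and not seen[key] then
            seen[key] = true
            count = count + 1
        end
        if type(value) == "table" then
            count = count + CountUniqueConstants(value, seen)
        elseif (type(value) == "string" or type(value) == "number") and not seen[value] then
            seen[value] = true
            count = count + 1
        end
    end
    return count
end

-- e.g. d(CountUniqueConstants(MyCombatLogSavedVars)) -- placeholder table name

Once that estimate gets anywhere near 131000, the addon can warn the user or simply stop logging.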

Also, knowing how the problem occurs tells me how I can improve my data format in a way that minimizes disk space usage while also limiting the number of unique values and keys. I'll probably discuss a few ideas in the chat over the next few days.
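One idea along those lines, purely as an illustration (the field names are made up): pack each event into a single string, so an event adds one value constant instead of up to one per field.

Code:
-- Illustration only: one packed string per event instead of a sub-table with
-- several unique values.
local function PackEvent(timeMs, result, abilityId, hitValue)
    return table.concat({ timeMs, result, abilityId, hitValue }, ",")
end

-- for reading it back during analysis:
local function UnpackEvent(line)
    local timeMs, result, abilityId, hitValue = line:match("([^,]+),([^,]+),([^,]+),([^,]+)")
    return tonumber(timeMs), tonumber(result), tonumber(abilityId), tonumber(hitValue)
end

The numbered keys 1 to n still count towards the limit, of course, but the value side stays a lot smaller and the file shrinks too.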

On the question of why I want to store this much data: having a full combat log is useful for theorycrafting or finding bugs in the combat system. Usually I cannot stop during a raid, so looking and filtering through the log at a later point gives me valuable insights into what is happening. Having an option to analyze it after the raid is important for me. Of course the general user doesn't need that, which is why by default the log won't be saved; it requires Shift+Clicking the save button.
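The Shift+Click check itself is simple; something like this in the save button's handler (the surrounding function name is a placeholder):

Code:
-- Sketch of the "only save on Shift+Click" behaviour; SaveFullCombatLog is a placeholder.
local function OnSaveButtonClicked()
    if IsShiftKeyDown() then
        SaveFullCombatLog()
    else
        d("Hold Shift while clicking to save the full combat log.")
    end
end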

I'll probably work on a way to save only selected entries (e.g. only damage events) to improve this situation. Having played around a bit, I also got a good feel for the increase in loading time when a 130 MB file is loaded. Because of this, I'll also add a limiter that interacts with the user once the saved variables reach a certain size.
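The filter and the limiter could look roughly like this (threshold and names are placeholders; the ACTION_RESULT constants are the ones the combat event passes along):

Code:
-- Sketch of the planned filter + limiter; MAX_ENTRIES is an arbitrary placeholder
-- chosen to stay well below the point where corruption was observed.
local MAX_ENTRIES = 100000

local function ShouldLogEvent(result)
    -- only keep damage events in this example
    return result == ACTION_RESULT_DAMAGE or result == ACTION_RESULT_CRITICAL_DAMAGE
end

local function AddLogEntry(savedLog, result, packedEvent)
    if #savedLog >= MAX_ENTRIES then
        -- here the addon would ask the user whether to keep going or stop logging
        return false
    end
    if ShouldLogEvent(result) then
        savedLog[#savedLog + 1] = packedEvent
    end
    return true
end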