fix: out of memory error when importing under postgresql
If you try to import more than 20k items under PostgreSQL, you can run
out of memory:

    psycopg2.errors.OutOfMemory: out of shared memory
    HINT:  You might need to increase max_locks_per_transaction.

Raising max_locks_per_transaction may help, but that is untested at
this point.
This checkin forces a commit to the postgres database after 10,000
rows have been added. The commit releases the savepoint taken for each
row and starts a new transaction (see the sketch under
back_postgresql.py below).
back_postgresql.py:
  Implement the commit mechanism in checkpoint_data(). Add two
  class-level attributes: one tracking the number of savepoints taken,
  the other the limit at which the commit should happen.
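
A minimal sketch of the mechanism, with names inferred from the
description above (the real attribute names and SQL in
back_postgresql.py may differ; savepoint_limit stands in for the
value read from RDBMS_SAVEPOINT_LIMIT):

    class Database(rdbms_common.Database):
        # class-level attributes added by this checkin (names assumed)
        savepoint_count = 0        # savepoints taken since last commit
        savepoint_limit = 10000    # commit once this many accumulate

        def checkpoint_data(self):
            # one savepoint per imported row
            self.sql('SAVEPOINT import_checkpoint')
            self.savepoint_count += 1
            if self.savepoint_count >= self.savepoint_limit:
                # commit releases every accumulated savepoint (and the
                # locks it holds) and starts a fresh transaction
                self.commit()
                self.savepoint_count = 0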
roundup_admin.py:
  Implement the pragma and dynamically create the config item
  RDBMS_SAVEPOINT_LIMIT used by checkpoint_data(). Also fix the
  formatting of descriptions when using pragma list in verbose mode.
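
For example, to lower the limit before a large import (the session
below is illustrative; the pragma name comes from this checkin, the
exact syntax may differ):

    roundup-admin -i /path/to/tracker
    roundup> pragma savepoint_limit=100
    roundup> import /path/to/export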
admin_guide.txt, upgrading.txt:
  Document the change and the use of pragma savepoint_limit in
  roundup-admin to change the default of 10,000.
test/db_test_base.py:
  Add some more asserts. In the existing testAdminImportExport, set the
  savepoint limit to 5 to exercise the setting mechanism and to make
  the commit code run under the existing tests. This provides coverage,
  but does not actually verify that the commit happens every 5
  savepoints 8-(. That was checked manually using a pdb breakpoint just
  before the commit, roughly as sketched below.
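
The manual check amounted to a temporary breakpoint (placement assumed
to match the sketch above; this was never committed code):

    if self.savepoint_count >= self.savepoint_limit:
        import pdb; pdb.set_trace()   # inspect savepoint_count here
        self.commit()
        self.savepoint_count = 0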
acknowledgements.txt:
  Add a 2.4.0 section mentioning Norbert, as he has done a ton of
  testing with much larger datasets than I can test with.