Problem

Doing MyBaseModel.objects.bulk_create([MyModelA(), MyModelB()]) results in an AssertionError.
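For context, a minimal sketch of the setup this report implies (plain-Python stand-ins rather than real models, since the field definitions are not part of this report; in the project, MyBaseModel is a concrete Django model whose manager yields an InheritanceQuerySet from django-model-utils, as the traceback below shows, and MyModelA/MyModelB subclass it):

```python
# Plain-Python stand-ins for the models in this report (hypothetical --
# the real classes are Django models, and MyBaseModel's manager is
# django-model-utils' InheritanceManager, per the InheritanceQuerySet
# in the traceback).
class MyBaseModel:
    pass

class MyModelA(MyBaseModel):
    pass

class MyModelB(MyBaseModel):
    pass

# The failing call hands bulk_create a list that mixes two subclasses:
new_events = [MyModelA(), MyModelB()]
print(sorted(type(o).__name__ for o in new_events))
# → ['MyModelA', 'MyModelB']
```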
Traceback
/Users/miloi/.local/share/virtualenvs/my-api-vsQQamSD/lib/python3.10/site-packages/rest_framework/mixins.py:19: in create
    self.perform_create(serializer)
../../../myapi/particle/views/at_event_stream_views.py:28: in perform_create
    MyBaseModel.objects.bulk_create(new_events)
/Users/miloi/.local/share/virtualenvs/my-api-vsQQamSD/lib/python3.10/site-packages/django/db/models/manager.py:87: in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <InheritanceQuerySet []>
objs = [<MyModelA: MyModelA object (None)>, <MyModelB: MyModelB object (None)>, <MyModelA: MyModelA object (None)...odelA: MyModelA object (None)>, <MyModelB: MyModelB object (None)>, <MyModelA: MyModelA object (None)>, ...]
batch_size = None, ignore_conflicts = False, update_conflicts = False
update_fields = None, unique_fields = None

    def bulk_create(
        self,
        objs,
        batch_size=None,
        ignore_conflicts=False,
        update_conflicts=False,
        update_fields=None,
        unique_fields=None,
    ):
        """
        Insert each of the instances into the database. Do *not* call
        save() on each of the instances, do not send any pre/post_save
        signals, and do not set the primary key attribute if it is an
        autoincrement field (except if features.can_return_rows_from_bulk_insert=True).
        Multi-table models are not supported.
        """
        # When you bulk insert you don't get the primary keys back (if it's an
        # autoincrement, except if can_return_rows_from_bulk_insert=True), so
        # you can't insert into the child tables which references this. There
        # are two workarounds:
        # 1) This could be implemented if you didn't have an autoincrement pk
        # 2) You could do it by doing O(n) normal inserts into the parent
        #    tables to get the primary keys back and then doing a single bulk
        #    insert into the childmost table.
        # We currently set the primary keys on the objects when using
        # PostgreSQL via the RETURNING ID clause. It should be possible for
        # Oracle as well, but the semantics for extracting the primary keys is
        # trickier so it's not done yet.
        if batch_size is not None and batch_size <= 0:
            raise ValueError("Batch size must be a positive integer.")
        # Check that the parents share the same concrete model with the our
        # model to detect the inheritance pattern ConcreteGrandParent ->
        # MultiTableParent -> ProxyChild. Simply checking self.model._meta.proxy
        # would not identify that case as involving multiple tables.
        for parent in self.model._meta.get_parent_list():
            if parent._meta.concrete_model is not self.model._meta.concrete_model:
                raise ValueError("Can't bulk create a multi-table inherited model")
        if not objs:
            return objs
        opts = self.model._meta
        if unique_fields:
            # Primary key is allowed in unique_fields.
            unique_fields = [
                self.model._meta.get_field(opts.pk.name if name == "pk" else name)
                for name in unique_fields
            ]
        if update_fields:
            update_fields = [self.model._meta.get_field(name) for name in update_fields]
        on_conflict = self._check_bulk_create_options(
            ignore_conflicts,
            update_conflicts,
            update_fields,
            unique_fields,
        )
        self._for_write = True
        fields = opts.concrete_fields
        objs = list(objs)
        self._prepare_for_bulk_create(objs)
        with transaction.atomic(using=self.db, savepoint=False):
            objs_with_pk, objs_without_pk = partition(lambda o: o.pk is None, objs)
            if objs_with_pk:
                returned_columns = self._batched_insert(
                    objs_with_pk,
                    fields,
                    batch_size,
                    on_conflict=on_conflict,
                    update_fields=update_fields,
                    unique_fields=unique_fields,
                )
                for obj_with_pk, results in zip(objs_with_pk, returned_columns):
                    for result, field in zip(results, opts.db_returning_fields):
                        if field != opts.pk:
                            setattr(obj_with_pk, field.attname, result)
                for obj_with_pk in objs_with_pk:
                    obj_with_pk._state.adding = False
                    obj_with_pk._state.db = self.db
            if objs_without_pk:
                fields = [f for f in fields if not isinstance(f, AutoField)]
                returned_columns = self._batched_insert(
                    objs_without_pk,
                    fields,
                    batch_size,
                    on_conflict=on_conflict,
                    update_fields=update_fields,
                    unique_fields=unique_fields,
                )
                connection = connections[self.db]
                if (
                    connection.features.can_return_rows_from_bulk_insert
                    and on_conflict is None
                ):
>                   assert len(returned_columns) == len(objs_without_pk)
E                   AssertionError

/Users/miloi/.local/share/virtualenvs/my-api-vsQQamSD/lib/python3.10/site-packages/django/db/models/query.py:816: AssertionError

Destroying test database for alias 'default' ('test_my-api-test')...
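Note that the multi-table guard in the pasted source (`for parent in self.model._meta.get_parent_list(): ...`) inspects only the queryset's own model, never the classes of the elements of objs, so a heterogeneous list of subclass instances passes it and, in this report, ends at the later length assertion. Until that is addressed, one fallback that avoids bulk_create entirely is to save each instance individually (in the real project, inside transaction.atomic()). A minimal sketch, with a fake model standing in for the real ones since the real code needs a configured Django project:

```python
# Hypothetical fallback for perform_create: save each event individually
# instead of calling MyBaseModel.objects.bulk_create(new_events).
# Per-object save() issues one INSERT per table per object -- slower,
# but it handles subclass instances that bulk_create rejects.
def create_individually(objs):
    for obj in objs:
        obj.save()  # in the real project, wrap this loop in transaction.atomic()
    return objs

# Stand-in so the sketch runs without a configured Django project:
class FakeEvent:
    def __init__(self):
        self.pk = None

    def save(self):
        self.pk = id(self)  # a real Model.save() assigns the DB primary key

events = [FakeEvent(), FakeEvent()]
created = create_individually(events)
print(all(e.pk is not None for e in created))
# → True
```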
Environment
Django Model Utils version: 4.3.1
Django version: 4.2.7
Python version: 3.10.7
Other libraries used, if any: djangorestframework, psycopg2, dj-database-url