IntegrityError when loading Django fixtures with OneToOneField using SQLite - django

When attempting to load initial data via the syncdb command, Django throws the following error:
django.db.utils.IntegrityError: Problem installing fixtures: The row in table 'user_profile_userprofile' with primary key '1' has an invalid foreign key: user_profile_userprofile.user_id contains a value '1' that does not have a corresponding value in user_customuser.id.
There is a OneToOne relationship between the UserProfile model and CustomUser:
class UserProfile(TimeStampedModel):
    user = models.OneToOneField(settings.AUTH_USER_MODEL, null=True, blank=True)
Running ./manage.py dumpdata user --format=json --indent=4 --natural-foreign produces the following:
CustomUser Model Data
[
    {
        "fields": {
            "first_name": "Test",
            "last_name": "Test",
            "is_active": true,
            "is_superuser": true,
            "is_staff": true,
            "last_login": "2014-10-21T11:33:42Z",
            "groups": [],
            "user_permissions": [],
            "password": "pbkdf2_sha256$12000$Wqd4ekGdmySy$Vzd/tIFIoSABP9J0GyDRwCgVh5+Zafn9lOiTGin9/+8=",
            "email": "test@test.com",
            "date_joined": "2014-10-21T11:22:58Z"
        },
        "model": "user.customuser",
        "pk": 1
    }
]
Running ./manage.py dumpdata user_profile --format=json --indent=4 --natural-foreign produces the following:
Profile Model
[
    {
        "fields": {
            "weight": 75.0,
            "created": "2014-10-21T11:23:35.536Z",
            "modified": "2014-10-21T11:23:35.560Z",
            "height": 175,
            "user": 1
        },
        "model": "user_profile.userprofile",
        "pk": 1
    }
]
Loading just the CustomUser model's initial data first and then following up with the UserProfile data via loaddata works fine, which suggests to me that syncdb is attempting to load UserProfile before CustomUser has been loaded.
If the simplest solution is to force the load order, what would be the simplest way to do that?

I guess you should use migrations (https://docs.djangoproject.com/en/1.7/topics/migrations/), since they are applied in a defined order. If you are using a Django version older than 1.7, install South (https://south.readthedocs.org/en/latest/) instead.
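For example, since data migrations run in order, one option is a migration that calls loaddata explicitly. This is only a minimal sketch, assuming the fixtures are saved as customuser.json and userprofile.json and that the app is called user_profile; adjust the app label, dependency, and file names to your project:
from django.core.management import call_command
from django.db import migrations


def load_initial_data(apps, schema_editor):
    # Load the user fixture first so that userprofile.user_id
    # always points at an existing customuser row.
    call_command('loaddata', 'customuser.json')
    call_command('loaddata', 'userprofile.json')


class Migration(migrations.Migration):

    dependencies = [
        ('user_profile', '0001_initial'),  # placeholder dependency
    ]

    operations = [
        migrations.RunPython(load_initial_data),
    ]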

Related

Google Cloud Storage to BigQuery policy tags don't work

Context
On Airflow, I'm using the GoogleCloudStorageToBigQueryOperator to load files from Google Cloud Storage into BigQuery.
The schema is defined as per the BigQuery table schema documentation.
Policy tags are implemented as per the documentation and tested manually via the UI - that works as expected.
Blocker
The policy tags are not applied when the load completes, even though they are specified in the schema fields. The other schema fields work as expected.
import airflow
from airflow import DAG
from google.cloud import bigquery
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': airflow.utils.dates.days_ago(2),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
}

with DAG(
        'gcs_to_bq',
        catchup=False,
        default_args=default_args,
        schedule_interval=None) as dag:

    DATASET_NAME = "temp"
    TABLE_NAME = "table"

    gcs_to_bq_load = GoogleCloudStorageToBigQueryOperator(
        task_id='gcs_to_bq_load',
        bucket="temp-bucket",
        source_objects=['dummy_data/data.csv'],
        source_format='CSV',
        skip_leading_rows=1,
        write_disposition='WRITE_TRUNCATE',
        destination_project_dataset_table=f"{DATASET_NAME}.{TABLE_NAME}",
        schema_fields=[
            {
                "name": "id",
                "mode": "NULLABLE",
                "type": "INT64",
                "fields": []
            },
            {
                "name": "email",
                "mode": "REQUIRED",
                "type": "STRING",
                "description": "test policy tags",
                "policyTags": {
                    "names": ["projects/project-id/locations/location/taxonomies/taxonomy-id/policyTags/policytag-id"]
                }
            },
            {
                "name": "created_at",
                "mode": "NULLABLE",
                "type": "DATE",
                "fields": []
            }
        ],
        dag=dag)

    gcs_to_bq_load

How to get the Cognito users list in JSON format

I want to back up my Cognito users with Lambda, but I can't get the Cognito users list in JSON format with boto3. Here is what I do:
import boto3
import os
import json
from botocore.exceptions import ClientError

COGNITO_POOL_ID = os.getenv('POOL_ID')
S3_BUCKET = os.getenv('BACKUP_BUCKET')
ENV_NAME = os.getenv('ENV_NAME')
filename = ENV_NAME + "-cognito-backup.json"
REGION = os.getenv('REGION')

cognito = boto3.client('cognito-idp', region_name=REGION)
s3 = boto3.resource('s3')


def lambda_handler(event, context):
    try:
        response = (cognito.list_users(UserPoolId=COGNITO_POOL_ID, AttributesToGet=['email_verified', 'email']))['Users']
        data = json.dumps(str(response)).encode('UTF-8')
        s3object = s3.Object(S3_BUCKET, filename)
        s3object.put(Body=(bytes(data)))
    except ClientError as error:
        print(error)
But I get one string back, and I'm not sure it's JSON at all:
[{'Username': 'user1', 'Attributes': [{'Name': 'email_verified', 'Value': 'true'}, {'Name': 'email', 'Value': 'user1@xxxx.com'}], 'UserCreateDate': datetime.datetime(2020, 2, 10, 13, 13, 34, 457000, tzinfo=tzlocal()), 'UserLastModifiedDate': datetime.datetime(2020, 2, 10, 13, 13, 34, 457000, tzinfo=tzlocal()), 'Enabled': True, 'UserStatus': 'FORCE_CHANGE_PASSWORD'}]
I need something like this:
[
    {
        "Username": "user1",
        "Attributes": [
            {
                "Name": "email_verified",
                "Value": "true"
            },
            {
                "Name": "email",
                "Value": "user1@xxxx.com"
            }
        ],
        "Enabled": "true",
        "UserStatus": "CONFIRMED"
    }
]
Try this:
import ast
import json
print(ast.literal_eval(json.dumps(response)))
This is for the dict response from the SDK.
Edit: I just realized that since list_users also returns UserCreateDate, json.dumps will complain about the conversion because of the datetime value of the UserCreateDate key. If you strip that out, this will work without the ast module -
import json
data = {'Username': 'Google_11761250', 'Attributes': [{'Name': 'email', 'Value': 'abc@gmail.com'}], 'Enabled': True, 'UserStatus': 'EXTERNAL_PROVIDER'}
print(json.dumps(data))
> {"Username": "Google_11761250", "Attributes": [{"Name": "email", "Value": "abc@gmail.com"}], "Enabled": true, "UserStatus": "EXTERNAL_PROVIDER"}
You can check the output type by using
type(output)
I guess it is a list, so you can convert it to JSON and pretty-print it with:
print(json.dumps(output, indent=4))
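Another option for the datetime issue mentioned above is to give json.dumps a fallback serializer. A minimal sketch, reusing the response variable from the Lambda code above:
import json

# default=str converts anything json.dumps cannot serialize natively
# (e.g. the datetime values in UserCreateDate / UserLastModifiedDate)
# into its string representation.
data = json.dumps(response, indent=4, default=str).encode('UTF-8')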

How to add superuser in Django from fixture

How do I add a SUPERuser (not just a regular user) through Django fixtures?
Let's say I want login: admin, password: admin.
solution 1
On an empty database:
python manage.py createsuperuser
python manage.py dumpdata auth.User --indent 4 > users.json
and users.json will contain the fixture you need.
solution 2
./manage.py shell
>>> from django.contrib.auth.hashers import make_password
>>> make_password('test')
'pbkdf2_sha256$10000$vkRy7QauoLLj$ry+3xm3YX+YrSXbri8s3EcXDIrx5ceM+xQjtpLdw2oE='
and create a fixture file:
[
    {
        "model": "auth.user",
        "pk": 1,
        "fields": {
            "username": "admin",
            "password": "pbkdf2_sha256$10000$vkRy7QauoLLj$ry+3xm3YX+YrSXbri8s3EcXDIrx5ceM+xQjtpLdw2oE=",
            "is_superuser": true,
            "is_staff": true,
            "is_active": true
        }
    }
]
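Assuming you save this as users.json in one of your apps' fixtures/ directories, you can then load it with:
python manage.py loaddata users.json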
If you are using pytest-django you can use the existing fixture admin_user.
From RTD:
An instance of a superuser, with username “admin” and password “password” (in case there is no “admin” user yet).
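For example, a minimal test using that fixture might look like this (assuming pytest-django is installed and configured for your project):
import pytest


@pytest.mark.django_db
def test_admin_user_fixture(admin_user):
    # admin_user is created on the fly by pytest-django as a superuser
    assert admin_user.is_superuser
    assert admin_user.is_staff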

Django - populate table on startup with known values

I have a Dog model that has a "dog_type" field. I want dog_type to be chosen from a list of pre-defined dog types. I do NOT want to use a text field with choices, but a ForeignKey to a "DogType" model. How could I populate the DogType model with types on server startup? Is this good practice or a hack?
Thanks.
code:
class Dog(Model):
    name = CharField(...)
    dog_type = ForeignKey(DogType)

class DogType(Model):
    type_name = CharField(...)
    type_max_height = IntegerField(...)
    # etc.
You'll probably want to write a data migration that adds your choices to the database.
The advantage of this approach is that the data will be loaded into all your databases (production, dev, etc.).
(If you're not using migrations yet, you should consider it; it's a clean and well-supported way to manage your database.)
In your Django project, just run python manage.py makemigrations myapp --empty. This will create an empty migration file under myapp/migrations.
You can then edit it:
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from django.db import models, migrations

DEFAULT_DOG_TYPES = (
    ('Labrador', 90),
    ('Berger Allemand', 66),
    ('Chihuaha', -2),
)


def add_dog_types(apps, schema_editor):
    DogType = apps.get_model('myapp', 'DogType')
    for name, max_height in DEFAULT_DOG_TYPES:
        dog_type = DogType(name=name, max_height=max_height)
        dog_type.save()


def remove_dog_types(apps, schema_editor):
    # This is used to migrate backward.
    # You can remove your dog types, or just pass.
    DogType = apps.get_model('myapp', 'DogType')
    for name, max_height in DEFAULT_DOG_TYPES:
        dog_type = DogType.objects.get(name=name, max_height=max_height)
        dog_type.delete()


class Migration(migrations.Migration):

    dependencies = [
        # if you're already using migrations, this line will be different
        ('myapp', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(add_dog_types, remove_dog_types),
    ]
After that, all you need to do is run python manage.py migrate.
Not long after, I found that this feature is called "fixtures"...
Basically, what you need to do is place a "fixture" file in a format of your choice (JSON, YAML, ...) under "myapp/fixtures/". For example, a JSON file would look like this:
[
    {
        "model": "myapp.person",
        "pk": 1,
        "fields": {
            "first_name": "John",
            "last_name": "Lennon"
        }
    },
    {
        "model": "myapp.person",
        "pk": 2,
        "fields": {
            "first_name": "Paul",
            "last_name": "McCartney"
        }
    }
]
then simply run from the command line:
python manage.py loaddata <filename> # file with no path!

django-social-auth: logging in from a unit test client

I use django-social-auth as my authentication mechanism and I need to test my app with logged-in users. I'm trying:
from django.test import Client
c = Client()
c.login(username='myfacebook@username.com', password='myfacebookpassword')
The user who is trying to log in can log in successfully from a browser. The app is already allowed to access the user's data.
Any ideas how to login from a unittest when using django-social-auth as the authentication mechanism?
Thanks
Create a fixture with User instances
{
    "pk": 15,
    "model": "auth.user",
    "fields": {
        "username": "user",
        "first_name": "user",
        "last_name": "userov",
        "is_active": true,
        "is_superuser": false,
        "is_staff": false,
        "last_login": "2012-07-20 15:37:03",
        "groups": [],
        "user_permissions": [],
        "password": "!",
        "email": "",
        "date_joined": "2012-07-18 13:29:53"
    }
}
Create a fixture with UserSocialAuth instances like this:
{
    "pk": 7,
    "model": "social_auth.usersocialauth",
    "fields": {
        "uid": "1234567",
        "extra_data": "%some account social data%",
        "user": 15,
        "provider": "linkedin"
    }
}
This way you get a user who behaves just like a real user and has all the social data you need.
Set a new password for that user, and then you can use the standard auth mechanism to log them in:
...
user.set_password('password')
user.save()
logged_in = self.client.login(username='user', password='password')
and then just call a view that requires login:
self.client.get("some/url")
Don't forget that django.contrib.auth.backends.ModelBackend is needed in your AUTHENTICATION_BACKENDS, and that django.contrib.sessions should be in your INSTALLED_APPS tuple.
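As a rough illustration (the Facebook backend is only an example; list whichever django-social-auth backends you actually use), the relevant settings could look like this:
AUTHENTICATION_BACKENDS = (
    'social_auth.backends.facebook.FacebookBackend',  # example social backend
    'django.contrib.auth.backends.ModelBackend',      # required for client.login()
)

INSTALLED_APPS = (
    # ...
    'django.contrib.sessions',
    'social_auth',
    # ...
)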
Also, the advantage of using the standard auth backend here is that you don't need to make a server request to get OAuth tokens and so on.
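Putting it all together, a minimal sketch of such a test (the fixture file names, user pk, and URL are placeholders taken from the snippets above):
from django.contrib.auth.models import User
from django.test import TestCase


class SocialUserViewTests(TestCase):
    # Placeholder file names for the two fixtures shown above
    fixtures = ['users.json', 'social_auth.json']

    def setUp(self):
        # The fixture stores an unusable password ("!"), so set a real one
        user = User.objects.get(pk=15)
        user.set_password('password')
        user.save()

    def test_login_required_view(self):
        self.assertTrue(self.client.login(username='user', password='password'))
        response = self.client.get('/some/url/')  # placeholder URL
        self.assertEqual(response.status_code, 200)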