I have executed
bin/console d:s:u --force
The schema was then created successfully. However, when I execute the same command again, Symfony tries to re-create the schema. How can this be?
See full command line output:
$ bin/console d:s:u --force
Updating database schema...
Database schema updated successfully!
"7" queries were executed
$ bin/console d:s:u --force
Updating database schema...
[Doctrine\DBAL\Exception\TableExistsException]
An exception occurred while executing 'CREATE TABLE message (id INT AUTO_INCREMENT NOT NULL, user_id INT DEFAULT NULL, subject VARCHAR(255) NOT NULL, text VARCHAR(255) NOT NULL, INDEX IDX_B6BD307FA76ED395 (user_id), PRIMARY KEY(id)) DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci ENGINE = InnoDB':
SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'message' already exists
[Doctrine\DBAL\Driver\PDOException]
SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'message' already exists
[PDOException]
SQLSTATE[42S01]: Base table or view already exists: 1050 Table 'message' already exists
I had this config option, which caused the trouble:
schema_filter: "/user_field_data/"
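For reference, the option lives under the dbal section of the Doctrine configuration; a minimal sketch of the relevant part of config/packages/doctrine.yaml (standard DoctrineBundle layout):
doctrine:
    dbal:
        # Only tables matching this regex are visible to the schema tool,
        # so every other existing table (such as "message") looks missing,
        # and doctrine:schema:update keeps generating CREATE TABLE for it.
        schema_filter: "/user_field_data/"
Removing the line, or widening the regex so it matches every table Doctrine manages, makes the second run report that nothing needs to be updated.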
I get a timeout error when I run a query in ClickHouse on large data, so I want to run the query, export the result to a CSV file, and upload it to S3.
Yes, you can do that.
First, ensure s3_create_new_file_on_insert = 1 is set in the current ClickHouse session; otherwise the INSERT will fail if the target file already exists. You also need permission to execute the statement below.
SET s3_create_new_file_on_insert = 1
Example:
INSERT INTO FUNCTION s3('https://...naws.com/my.csv', 'KEY', 'SECRET')
SELECT user_id, name
FROM db.users
WHERE application_id = 2
More info
https://medium.com/datadenys/working-with-s3-files-directly-from-clickhouse-7db330af7875
https://clickhouse.com/docs/en/sql-reference/table-functions/s3/
A BigQuery table that I don't own was shared with me, and I don't have the bigquery.jobs.create permission in the project that contains the table.
I successfully listed all the tables in the dataset, but when I tried to query the table using this code:
tables.map(async (table) => {
  // The query job is created in the project referenced in the URL (PROJECT_ID)
  const url = `https://bigquery.googleapis.com/bigquery/v2/projects/${process.env.PROJECT_ID}/queries`;
  const query = `SELECT * FROM \`${table.id}\` LIMIT 10`;
  const data = {
    query,
    maxResults: 10,
  };
  const reqRes = await oAuth2Client.request({
    method: "POST",
    url,
    data,
  });
  console.log(reqRes.data);
});
I got the following error:
Error: Access Denied: Project project_id: <project_id>
gaia_id: <gaia_id>
: User does not have bigquery.jobs.create permission in project <project_id>.
I can't ask for those permissions. What should I do in this situation?
IMPORTANT:
I have tried to run the same query in the GCP console and it ran successfully, but it seems like it created a temporary clone of the table and then queried that clone rather than the original one.
There are two projects here: your project, and the project that contains the table.
You currently create the job in the ${process.env.PROJECT_ID} project that you use in the URL; try specifying your own project instead, where you can create jobs.
You'll also need to modify the query so that BigQuery can find the table: make sure ${table.id} includes the project (the table's, not yours), the dataset, and the table name.
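A sketch of both changes against the code from the question. MY_PROJECT_ID is a hypothetical variable for a project where you do hold bigquery.jobs.create, and table is assumed to come from a tables.list REST response, whose tableReference field carries the owning project and dataset:
tables.map(async (table) => {
  // Create the query job in *your* project, where you can create jobs...
  const url = `https://bigquery.googleapis.com/bigquery/v2/projects/${process.env.MY_PROJECT_ID}/queries`;
  // ...and fully qualify the table with the *owning* project and dataset.
  const ref = table.tableReference;
  const query = `SELECT * FROM \`${ref.projectId}.${ref.datasetId}.${ref.tableId}\` LIMIT 10`;
  const reqRes = await oAuth2Client.request({
    method: "POST",
    url,
    // Backtick-qualified project.dataset.table names require standard SQL
    data: { query, maxResults: 10, useLegacySql: false },
  });
  console.log(reqRes.data);
});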
I need to run a query on DynamoDB.
This is the code I have so far:
AWSCredentials creds = new DefaultAWSCredentialsProviderChain().getCredentials();
AmazonDynamoDBClient client = new AmazonDynamoDBClient(creds);
client.withRegion(Regions.US_WEST_2);
DynamoDB dynamoDB = new DynamoDB(new AmazonDynamoDBClient(creds));
Table table = dynamoDB.getTable("dev");
QuerySpec spec = new QuerySpec().withKeyConditionExpression("tableKey = :none.json");
ItemCollection<QueryOutcome> items = table.query(spec);
System.out.println(table);
The returned value of table is {dev: null}, which means that the description is null.
It's important to say that when I use the AWS CLI command aws dynamodb list-tables, I get a result with all the tables, yet the same operation in my code, dynamoDB.listTables(), returns an empty list.
Is there something that I'm doing wrong?
Do I need to define some more credentials before using the DynamoDB API?
I was getting the same problem and landed here looking for a solution. As mentioned in the javadoc of getDescription:
Returns the table description; or null if the table description has
not yet been described via {@link #describe()}. No network call.
Initially, the description is set to null. After the first call to describe(), which makes a network call, the description gets set, and getDescription() can be used after that.
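A minimal sketch of that fix against the code from the question (same table name, "dev"):
Table table = dynamoDB.getTable("dev");
// describe() makes the network call and caches the result on the Table object
TableDescription desc = table.describe();
System.out.println(desc);                    // no longer null
System.out.println(table.getDescription()); // now populated as well
Separately, note that the question builds the DynamoDB wrapper around a second, freshly constructed AmazonDynamoDBClient whose region is never set; reusing the already-configured client (new DynamoDB(client)) is the likely fix for the empty listTables() result as well.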
I have the following keys.json
{
"test": {"BOOL": true}
}
I run the following command
aws dynamodb get-item --table-name marvel-users-prod --key file://keys.json
but it's not returning the item that matches the key/value. What am I doing wrong?
I get the error
A client error (ValidationException) occurred when calling the GetItem operation: The provided key element does not match the schema
I tried using the GUI but the scan stopped.
Get-item can be used to get data by its primary key, and the DynamoDB hash key can't be of BOOL type.
The get-item operation returns a set of attributes for the item with
the given primary key. If there is no matching item, get-item does not
return any data.
Create Table Attribute Types:-
"AttributeType": "S"|"N"|"B"
Also, I assume that the boolean attribute that you would like to filter on is one of the attributes in the DynamoDB table. You may need to scan the table if you don't include the hash key in the filter criteria.
Scan command:-
"interested" - is my BOOL attribute name
:a - Attribute value placeholder. The value is present in the JSON file
aws dynamodb scan --table-name autotable --filter-expression "interested = :a" --expression-attribute-values file://scan-movies.json
Scan Movies JSON file:-
{
":a": {"BOOL" : true}
}
Steps I did:
1. Deleted the migration files.
2. Created only one initial migration file.
3. Entered the psql command prompt, connected to the database, and ran: drop schema public cascade; create schema public;
4. Tried to migrate again.
I get a MigrationSchemaMissing(Unable to create the django_migrations table (%s) % exc) error.
This answer and the comment on its question worked for me. In brief, you must grant the required privileges on the schema, as below:
grant usage on schema public to username;
grant create on schema public to username;
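For example, assuming a database named mydb and a Django database user named username (both placeholders), run the grants as a superuser and then retry the migration:
psql -U postgres -d mydb -c "grant usage on schema public to username;"
psql -U postgres -d mydb -c "grant create on schema public to username;"
python manage.py migrate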