Working with MySQL stored procedures in PythonAnywhere - Django

I have imported my MySQL Workbench data, along with the stored procedures, from my local machine to PythonAnywhere.
However, the queries in the stored procedures are not executing properly. I am trying to order the results in descending order, but it is not working.
Can anyone guide me?
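For context, a minimal sketch of calling a MySQL stored procedure from Django's database connection, assuming a hypothetical procedure named get_records_desc; the procedure, table, and column names are placeholders, not the asker's actual objects:

from django.db import connection

def fetch_records_desc():
    # Call a hypothetical stored procedure and return its result set.
    with connection.cursor() as cursor:
        cursor.callproc("get_records_desc", [])
        return cursor.fetchall()

# If the ordering comes back wrong, check that the SELECT inside the
# procedure body itself ends with something like
#   ORDER BY created_at DESC
# since the ordering has to come from the procedure's own query.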

Related

I ran the migration process and didn't get an error, but my database and tables are missing in SQL Server. VS is not transferring the data to SQL Server

Hi, I am a beginner developer. I am trying to send my data to SQL Server from Visual Studio, and I ran the migration process. In the Package Manager Console I ran Enable-Migrations, which gave me a configuration file; I then changed the value from false to true and ran update-database. I am not getting an error, it just says:
PM> update-database
Specify the '-Verbose' flag to view the SQL statements being applied to the target database.
No pending explicit migrations.
Running Seed method.
When I checked SQL Server, my database was not there, nor were any tables.
I am trying to make a travel website. Some pages already exist and I have model files for them, so I guess I need to create a connection between SQL Server and Visual Studio. I have the code in Visual Studio, but when I try to create the database and tables with my data, nothing happens.

How to run dynamic queries in an Informatica Cloud mapping task?

I am new to Informatica Cloud. I have a list of queries ready in my table, like below.
Now I want to take the queries from this table one by one, use each as a source query, and load whatever results it returns into the target. All the tables have already been created in the source and the target.
I just need to copy the data based on the dynamic queries kept in one of my SQL tables.
If anyone has any ideas, please share your thoughts with me. It would be a great help.
The source connection will be the connector to your source database, and the Source Type will be query. From there it depends on how you are managing your variables. See the thread on Informatica Network for links to multiple examples.
Read the table as you normally would in the cloud, then pass each record into the SQL transformation for execution. Configure where the SQL transformation has to execute, and it will run the queries in the database you want.
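This is not Informatica syntax, but a rough plain-Python sketch of the same control flow (read a control table of queries, run each one against the source, and load whatever it returns into the target), using sqlite3 stand-ins and invented table names:

import sqlite3

# Hypothetical source and target databases; in Informatica Cloud these
# would be your source and target connections.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.executescript("""
CREATE TABLE query_list (id INTEGER, src_query TEXT, target_table TEXT);
CREATE TABLE emp (id INTEGER, name TEXT);
INSERT INTO emp VALUES (1, 'alice'), (2, 'bob');
INSERT INTO query_list VALUES (1, 'SELECT id, name FROM emp', 'emp_copy');
""")
target.execute("CREATE TABLE emp_copy (id INTEGER, name TEXT)")

# Read the control table, then run each stored query against the source
# and insert its result set into the named target table.
for _, src_query, target_table in source.execute(
        "SELECT id, src_query, target_table FROM query_list"):
    rows = source.execute(src_query).fetchall()
    if rows:
        placeholders = ", ".join("?" for _ in rows[0])
        target.executemany(
            "INSERT INTO %s VALUES (%s)" % (target_table, placeholders), rows)
target.commit()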
You can use a SQL task to run dynamic SQL queries.
Link to the SQL task approach: https://www.datastackpros.com/2019/12/informatica-cloud-incremental-load_14.html

How to read data from a PostgreSQL database using DAS?

We are working on an ETL process. How can we read data from a PostgreSQL database using streams in Data Analytics Server, perform some operations on those streams, and insert the manipulated data into another PostgreSQL database on a schedule? Please share the procedure to follow.
Actually, you don't need to publish data from your PostgreSQL server. Using WSO2 Data Analytics Server (DAS), you can pull data from your database and do the analysis. Finally, you can push the results back to the PostgreSQL server. In DAS, there is a special connector called "CarbonJDBC", and using that connector you can easily do this.
The current version of the "CarbonJDBC" connector supports the following database management systems:
MySQL
H2
MS SQL
DB2
PostgreSQL
Oracle
You can use the following queries to pull data from your PostgreSQL database and populate a Spark table. Once the Spark table is populated with data, you can start your data analysis tasks.
create temporary table <temp_table> using CarbonJDBC options (dataSource "<datasource name>", tableName "<table name>");
select * from <temp_table>;
insert into / overwrite table <temp_table> <some select statement>;
For more information regarding the "CarbonJDBC" connector, please refer to the following blog post [1].
[1]. https://pythagoreanscript.wordpress.com/2015/08/11/using-the-carbon-spark-jdbc-connector-for-wso2-das-part-1/

MySQL mysqldump exporting incomplete files?

I am trying to export a MySQL database from an Amazon Web Services Relational Database Service (RDS) instance. I have tried connecting and exporting through MySQL Workbench, and through the command line on the EC2 server, but the table exports still end up incomplete.
One table exports the first 5459 rows (1.1Mb) and then stops halfway through a row.
Another table exports the first 676 rows (7.7Mb) and then also stops halfway through a row; this table should have about 39000 rows.
I haven't used Amazon Web Services, Linux, or MySQL before; I normally work on Windows servers with Microsoft SQL Server. Has anyone had this error before, or does anyone know what the cause could be?
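For reference, this is roughly what an export of an RDS database over the command line looks like, here wrapped in Python; the endpoint, credentials, and database name are placeholders, and the extra mysqldump options are only assumptions about what might matter when rows appear cut off, not a confirmed fix:

import subprocess

# Hypothetical RDS endpoint, credentials, and database name.
cmd = [
    "mysqldump",
    "--host=mydb.abc123.us-east-1.rds.amazonaws.com",
    "--user=admin",
    "--password=secret",
    "--single-transaction",        # consistent snapshot for InnoDB tables
    "--max-allowed-packet=512M",   # raise the client-side packet limit for large rows
    "mydatabase",
]

# Write the dump to a file and fail loudly if mysqldump exits with an error,
# which is easy to miss when the output is simply redirected.
with open("mydatabase.sql", "wb") as out:
    subprocess.run(cmd, stdout=out, check=True)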

Django switch from sqlite3 to PostgreSQL issues

I've recently changed the database backend on my project from sqlite3 to PostgreSQL, and I have a few questions that I hope will help resolve my issues.
I understand that switching from SQLite to PostgreSQL implies that I create the new database and the tables inside it, right? I've done that, but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py.)
I probably should mention that I'm working in a virtual environment, and I would like to know whether that affects my references in any way. I've tried to import the tables in Django to count the number of records in a table, but I get the error "No module named psdemo". (psdemo is my database name, and I'm trying to import the table with:
from ps.psdemo import Product
where ps is my application, psdemo is my database, and Product is the table in the database.)
In conclusion, I'm trying to get access to my database and tables, but I can't manage to find them. I repeat, there is no new database file in my project or in my virtual environment (I've searched thoroughly). From a terminal I can activate my virtual environment, change directories to the application folder, connect to the PostgreSQL server, create the database and tables, insert into them, run queries, and so on, but I cannot access them from the Django code.
I understand that switching from SQLite to PostgreSQL implies that I create the new database and the tables inside it, right? I've done that, but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py.)
All you have to do with PostgreSQL is create the database, not the tables. Django will create the tables, and anything else it thinks is useful, once you call syncdb.
You won't have any new files in your project like you did with SQLite. If you want to view your database, you should download and install pgAdmin III (which I would recommend in any event).
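For reference, a minimal sketch of what the PostgreSQL block in settings.py might look like for this setup, assuming the database is named psdemo and psycopg2 is installed; the credentials are placeholders:

# settings.py (hypothetical values)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'psdemo',        # the database created in PostgreSQL
        'USER': 'myuser',
        'PASSWORD': 'mypassword',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}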
I probably should mention that I'm working in a virtual environment, and I would like to know whether that affects my references in any way. I've tried to import the tables in Django to count the number of records in a table, but I get the error "No module named psdemo". (psdemo is my database name, and I'm trying to import the table with:
Here, you import models via normal Python syntax, and they then reference your tables. Each model should represent a single table. You define your models first, and then call
python manage.py syncdb
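A hedged sketch of what that looks like, assuming the app is called ps and the models live in ps/models.py; the Product fields are invented for illustration:

# ps/models.py (hypothetical fields)
from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=100)
    price = models.DecimalField(max_digits=8, decimal_places=2)

# After "python manage.py syncdb" has created the table, you can count rows:
#   from ps.models import Product
#   Product.objects.count()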
In conclusion, I'm trying to get access to my database and tables, but I can't manage to find them.
See above, but you should definitely read about PostgreSQL installation in the PostgreSQL docs, and read the psycopg2 docs as well as the Django docs for setting up a PostgreSQL database.
I understand that switching from SQLite to PostgreSQL implies that I create the new database and the tables inside it, right? I've done that, but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py.)
Database files are not created in the project directory with PostgreSQL. They are created in the database server's data directory (such as /var/lib/postgres; it depends on the distribution). You should generally query the database through a PostgreSQL client that connects to the PostgreSQL server rather than working with the files directly.
You can, for example, run the command:
manage.py dbshell
As to your first issue, see #jpic's answer.
On your second issue, your database is not a package, and you do not import models from your database. If you were able to import your models correctly before you made any changes, change your import statements back to how they were.
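In other words, the import should point at the app's models module rather than at the database name; a hedged example, assuming Product is defined in ps/models.py:

# instead of: from ps.psdemo import Product
from ps.models import Product

print(Product.objects.count())  # number of rows in the Product table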