Does anything like Apex exist for other databases? - oracle-apex

I used Apex just a bit, years ago. Apex fills me with admiration; it effectively reduces the complexity of publishing data to the web. I'm very good with relational databases but never took to Oracle, and that's part of why I didn't stick with Apex.
For all of these years I've been hoping that something like Apex would show up that works with SQL Server or PostgreSQL. Does anyone know if such a thing exists?

Related

Django supported alternative to noSQL

We need a reasonable insert and query speed over huge tables so I considered using some noSQL adapter with Django. Unfortunately:
Django does not provide official support for noSQL databases.
In our original schema, some of the Big Data is related to other Big Data, making data duplication unacceptable.
Project deadlines are enemies of hot stuff like this.
So, as far as I can see, PostgreSQL should be the way to go for this scenario, right?!
Please let me know any other detail that may be relevant to this question!
Bonus to anyone that can point out some useful database techniques like database sharding...
Well, there is a fork of the Django project that uses MongoDB as the backend. You can read about it here. The code on GitHub is here. To give you a heads-up, MongoDB is a NoSQL DB that does support sharding and replication, so I think this might be something you are looking for.
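If you go that route, the database configuration looks roughly like the sketch below. The engine name and settings are assumptions based on the django-mongodb-engine fork and may differ depending on the version you end up using.

    # settings.py (sketch for the django-nonrel / django-mongodb-engine fork)
    DATABASES = {
        "default": {
            "ENGINE": "django_mongodb_engine",  # assumption: backend name exposed by the fork
            "NAME": "my_database",
            "HOST": "localhost",
            "PORT": 27017,
        }
    }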

How to move from a database backend to another on a production Django project?

I would like to move a database in a Django project from one backend to another (in this case Azure SQL to PostgreSQL, but I want to think of it as a generic situation). I can't use a dump, since the databases are different.
I was thinking of something at the Django level, like dumpdata, but depending on the amount of available memory and the size of the DB it sometimes appears unreliable and crashes.
I have seen solutions that try to break the process into smaller parts that the memory can handle but it was a few years ago, so I was hoping to find other solutions.
So far my searches have failed since they always lead to 'south', which refers to schema migration and not moving data.
I have not implemented this before, but what about the following:
Django supports multiple databases, so just configure DATABASES in your settings file with both the old Azure SQL database and the new PostgreSQL database. Then create a small script that makes use of bulk_create, reading the data from one DB and writing it to the other.
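A minimal sketch of that approach might look like the following. The database aliases, the MSSQL backend name and the model are assumptions; adjust them to your actual settings and app.

    # settings.py (sketch): one alias per backend
    DATABASES = {
        "default": {  # the new PostgreSQL database
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "newdb",
            # user/password/host omitted for brevity
        },
        "legacy": {  # the old Azure SQL database
            "ENGINE": "mssql",  # assumption: whichever MSSQL backend package you use
            "NAME": "olddb",
        },
    }

    # copy_data.py (sketch): stream rows from the old DB, insert in batches into the new one
    from myapp.models import MyModel  # hypothetical model

    BATCH_SIZE = 1000

    def copy_model(model):
        batch = []
        # iterator() avoids loading the whole table into memory at once
        for obj in model.objects.using("legacy").iterator():
            batch.append(obj)  # instances keep their primary keys, so FKs stay consistent
            if len(batch) >= BATCH_SIZE:
                model.objects.using("default").bulk_create(batch)
                batch = []
        if batch:
            model.objects.using("default").bulk_create(batch)

    copy_model(MyModel)

Copy models in dependency order (parents before children) so that foreign keys already exist on the target side.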

Why does the Django community encourage the use of Postgres over MySQL?

The Django community seems to encourage the use of Postgres. I understand that in a big project you probably don't want to use SQLite or the like, but I don't know why they don't like MySQL that much.
Just a quick example, from the Django Book, page 9:
We’re quite fond of PostgreSQL ourselves, for reasons outside the scope of this book, so we mention it first. However, all those engines will work equally well with Django.
The Postgres name is always associated with Django, just like MySQL is always associated with PHP.
The best answer to this question would come from those who write the framework...
But here is what you can get from the documentation:
By default, Django starts a transaction when a database connection is first used and commits the result at the end of the request/response handling. The PostgreSQL backends normally operate the same as any other Django backend in this respect.
Django has strong transaction control, and using a powerful free DBMS that also has strong transaction control is a plus...
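As a small illustration of what that buys you on a recent Django version, a block of writes can be made all-or-nothing with transaction.atomic. The models here are made up for the example; the point is that the rollback only actually happens if the storage engine supports transactions.

    from django.db import transaction

    from myapp.models import Account, TransferLog  # hypothetical models


    def transfer(from_id, to_id, amount):
        # Everything inside the atomic block commits together; any exception
        # rolls the whole thing back, but only if the engine supports
        # transactions (MyISAM tables silently don't).
        with transaction.atomic():
            src = Account.objects.select_for_update().get(pk=from_id)
            dst = Account.objects.select_for_update().get(pk=to_id)
            src.balance -= amount
            dst.balance += amount
            src.save()
            dst.save()
            TransferLog.objects.create(source=src, target=dst, amount=amount)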
Also, previous versions of MySQL (before MySQL 5.0) had some integrity problems, and MySQL's fast storage engine, MyISAM, does not support foreign keys or transactions. So using MyISAM has important minuses alongside its pluses...
And here is a wiki on why Postgres is better than MySQL. It's quite old, but it is good.
I have never used PostgreSQL myself, so I can't really say much about the advantages it is supposed to have over MySQL.
But from what I've gathered, transaction handling is better supported by PostgreSQL out of the box. If you use MySQL, chances are that you will use MyISAM as the storage engine, which doesn't support transactions.
https://docs.djangoproject.com/en/dev/topics/db/transactions/#transactions-in-mysql
Maybe the Django devs just got sick and tired of having to deal with bug reports where transactions didn't work, but the problem was due to MyISAM and not Django.
The developers of South (the most widely used database schema migration framework for Django) apparently aren't too fond of MySQL either, judging by this message, which I've seen quite often with MySQL:
! Since you have a database that does not support running
! schema-altering statements in transactions, we have had to
! leave it in an interim state between migrations.
[...]
! The South developers regret this has happened, and would
! like to gently persuade you to consider a slightly
! easier-to-deal-with DBMS.
I've always used Postgres whenever possible, partly because of maturity and because of the PostGIS extensions that add spatial data capabilities to the database. Even if I don't think I'm going to want spatial data in my application at the beginning, it's much easier to add it later if your DB supports it, rather than having to tear out MySQL at a late stage and replace it with PostGIS.
I think there is a spatial extension for MySQL now, so you might be able to do spatial operations there as well. But Postgres just does it, and has been doing it for years.
Or I could spend $$$$$ for Oracle Spatial, I suppose...
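To make the spatial point concrete, here is a minimal GeoDjango sketch. The model and query are illustrative only and assume the PostGIS database backend (django.contrib.gis.db.backends.postgis) is configured.

    from django.contrib.gis.db import models
    from django.contrib.gis.geos import Point
    from django.contrib.gis.measure import D


    class Shop(models.Model):
        name = models.CharField(max_length=100)
        location = models.PointField()  # stored as a PostGIS geometry column


    # Find shops within 5 km of a given longitude/latitude.
    nearby = Shop.objects.filter(
        location__distance_lte=(Point(-0.1276, 51.5072), D(km=5))
    )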

Database choice for Django project

The Django (http://djangoproject.com) framework currently supports the following databases: PostgreSQL, SQLite 3, MySQL 5 and Oracle. This question is not about comparing these databases; rather, I want to know details about their compatibility with Django and how one should choose an adequate database for a simple (but growing) project.
It depends. If you don't want to pay the (huge) premium for Oracle, your choice is between MySQL and PostgreSQL (SQLite is mostly meant for development, not production). PostgreSQL seems to be the choice of most Django core devs (Andrew Godwin went as far as putting "friends don't let friends use MySQL" into a DjangoCon talk). Nevertheless, MySQL is fully supported by Django and is used by many Django websites in production.
IMHO, PostgreSQL has two clear advantages over MySQL:
If you use GeoDjango: there's really no alternative to PostGIS. MySQL's GIS support is lackluster at best.
South: MySQL is not able to use transactions during schema migrations, which means that if a South migration goes awry, you're hanging between two migrations without a clear way forward or backward. PostgreSQL saves a lot of pain in such situations (see the sketch below).
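A quick way to see the difference is to run DDL inside a transaction directly against PostgreSQL. This is just a sketch using psycopg2; the connection string is a placeholder.

    import psycopg2

    conn = psycopg2.connect("dbname=test user=test")  # placeholder DSN
    cur = conn.cursor()

    cur.execute("CREATE TABLE migration_demo (id serial PRIMARY KEY)")
    conn.rollback()  # PostgreSQL rolls the CREATE TABLE back cleanly

    # The table no longer exists, so a failed migration leaves nothing behind.
    # On MySQL, DDL statements implicitly commit, so there is nothing to roll
    # back and a half-applied migration stays half-applied.
    conn.close()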
The quick answer is it doesn't really matter but if I had to pick one, it would probably be PostgreSQL.
The original Django developers heavily recommended PostgreSQL in the early days. PostgreSQL is arguably/technically a 'better' database than MySQL, so if you've never used it before it might be worth investing some time to learn it.
Nowadays, though, I'd say it's safe to assume that all the DB drivers work equally well. So if you are more familiar with MySQL or Oracle, there should be no problem using them instead. I've only used Django with MySQL and haven't had any problems.
All of the pluggable databases are supported, meaning the model/DB layer will abstract the DB specifics for you. As for performance, I'd suggest MySQL 5 or PostgreSQL. Oracle will definitely perform well, but might be more expensive in the long run. If you are just starting your project, sqlite3 is your friend, simply because it will be set up within seconds.
Handling parallel settings for development and production has been discussed in various blog posts, e.g. in this one:
http://ericholscher.com/blog/2011/jan/10/handling-django-settings-files/
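One common pattern from posts like that one, sketched here with made-up module and host names, is to keep a base settings module and override DATABASES per environment:

    # settings/base.py
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.sqlite3",  # quick to set up for local development
            "NAME": "dev.sqlite3",
        }
    }

    # settings/production.py
    from .base import *  # noqa: F401,F403

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "myproject",
            "HOST": "db.example.com",  # placeholder host
            "USER": "myproject",
            "PASSWORD": "change-me",   # placeholder; load from the environment in practice
        }
    }

Point DJANGO_SETTINGS_MODULE at the file you want for each environment.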

Is there a good in-memory database that would act like DB2 [closed]

I am currently using DB2 to do unit tests, but it is sometimes quite slow. I would need a good in-memory database that would include all the features of DB2. Does this type of in-memory database exist, or do they only support standard SQL features?
Thank you.
EDIT
The DB2 Database is on a remote server, so I would need a solution to replicate the schema of that database on a local in-memory database to speed up the tests.
I would need a good in-memory database that would include all the features of DB2.
Derby (formerly Cloudscape) is language-compatible with DB2, and it has an in-memory mode.
Maybe also have a look at H2 (with its DB2 compatibility mode). But even if H2 were faster, I would consider Derby in your case.
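If your tests happen to be driven from Python rather than Java, a hedged sketch of wiring either engine up over JDBC could look like the following (jar paths and credentials are placeholders; in a Java project you would use the same JDBC URLs directly):

    import jaydebeapi

    # In-memory H2 with its DB2 compatibility mode; DB_CLOSE_DELAY=-1 keeps the
    # database alive for the whole process.
    h2 = jaydebeapi.connect(
        "org.h2.Driver",
        "jdbc:h2:mem:testdb;MODE=DB2;DB_CLOSE_DELAY=-1",
        ["sa", ""],
        "/path/to/h2.jar",  # placeholder path to the H2 jar
    )

    # In-memory embedded Derby.
    derby = jaydebeapi.connect(
        "org.apache.derby.jdbc.EmbeddedDriver",
        "jdbc:derby:memory:testdb;create=true",
        jars="/path/to/derby.jar",  # placeholder path to the Derby jar
    )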
I think the easiest option would be --- DB2.
Download the free Express edition and install it on your PC. Your slow speed is almost certainly down to network bottlenecks and limitations of the DB2 client; installing locally would eliminate these.
The next best thing would be JavaDB (better known as Derby), which is similar but not quite identical to DB2.
Using any other database will dump you in a quagmire of unsupported features, incompatible SQL, different names for the same function, different functions with the same function name, etc., etc.
Why not use tmpfs (Unix), or any conventional RAM-drive solution for Windows?
Or, you can get a fast SSD.
As mentioned by Pascal, Derby is almost syntactically identical to DB2. Having tried it for the same thing a while back, we had an issue with Derby not supporting sequences, but the connection protocols are identical; e.g. you can use the DB2 JDBC driver to connect to a Derby database.
If your application abstracts your database connection with, say, Hibernate or NHibernate (if .NET), using H2 or HSQLDB could also work for unit tests, assuming you aren't relying on stored procedures and the like.
Another tool that helps out with schema migration across multiple DBs is http://liquibase.org. For a test build you have it build your in-memory DB, and for a deployment you have it build your DB2 database or generate the migration script. It will build the database with the correct schema and offers conditional migrations (e.g. GRANT is not available in HSQL, so you run that change set only for DB2).
I have no idea what features DB2 has, but SQLite can create an in-memory database. You might want to take a look at it.
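For reference, an in-memory SQLite database is a one-liner in Python's standard library (purely illustrative; it obviously won't give you DB2's SQL dialect):

    import sqlite3

    # ":memory:" creates a throwaway database that lives only in RAM.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO t (name) VALUES (?)", ("example",))
    print(conn.execute("SELECT id, name FROM t").fetchall())  # [(1, 'example')]
    conn.close()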
If you make your bufferpool large enough, your DB2 database will be in-memory, too.
Two possible in-memory databases are:
- TimesTen from Oracle.
- solidDB from IBM. This one can send transactions back to DB2, while queries will have really high performance.
For more information about solidDB: http://www-01.ibm.com/software/data/soliddb/