I'm trying to connect to Athena using the JDBC drivers provided by Amazon, with SQL Developer as the client. So far, I haven't had any luck with Java 1.8.181 and AthenaJDBC42-2.0.7.jar. Has anyone had any luck on this front? Before I try mixing up which versions of Java, the JDBC driver, and/or SQL Developer, I thought I'd at least ask whether anyone has been successful using SQL Developer with the Athena JDBC drivers.
No.
SQL Developer doesn't allow just any JDBC driver to be added; we restrict connectivity to the platforms we officially support for database migrations to the Oracle Database platform.
Athena doesn't have migration support, hence the lack of connectivity. If you need assistance with a migration, please send me a note.
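Since SQL Developer won't accept the driver, one fallback is a JDBC client that does allow arbitrary drivers. For what it's worth, the settings the 2.0.x Athena driver expects look roughly like this — the region, bucket, and credentials below are placeholders, not values from this thread:

```
Driver class:   com.simba.athena.jdbc.Driver
JDBC URL:       jdbc:awsathena://AwsRegion=us-east-1;
Properties:     User=<AWS access key id>
                Password=<AWS secret access key>
                S3OutputLocation=s3://my-athena-results/   (required; Athena stages query results here)
```

The `S3OutputLocation` property is mandatory; connections fail without a writable results bucket.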
We have an older version (I think it's 9) of Informix running on our server as the backend DB for a vendor application. I have an ODBC link from it to MS Access, but now they want to do Power BI work. I'm finding that the ODBC driver throws errors on certain tables, and given that it's vendor software, they're not inclined to upgrade the version. Everything I've seen suggests that Informix doesn't support replicas that aren't Informix, and cloud isn't an option. Any suggestions for getting this data into a different DB, other than pulling CSV files?
Has anyone tried to connect the Power BI tool with Presto?
I have used the connectors and drivers available on the Qubole documentation site. I have installed the drivers and connectors, but they throw an error when accessing the S3 location.
The Qubole drivers and connectors will work only with Qubole's managed Presto service.
Disclaimer: I work for Qubole.
I have Teradata data files on Server A and I need to copy them into HDFS on Server B. What options do I have?
distcp is ruled out because the Teradata files are not on HDFS.
scp is not feasible for huge files.
Flume and Kafka are meant for streaming, not for file movement. Even if I used Flume with a spooling directory source, it would be overkill.
The only option I can think of is NiFi. Does anyone have suggestions on how I can utilize NiFi?
Or, if someone has already been through this kind of scenario, what approach did you follow?
I haven't specifically worked with a Teradata dataflow in NiFi, but having worked with other SQL sources in NiFi, I believe it is possible and pretty straightforward to develop a dataflow that ingests data from Teradata into HDFS.
For starters, you can do a quick check with the ExecuteSQL processor available in NiFi. The SQL-related processors take a DBCPConnectionPool property, which is a NiFi controller service that should be configured with the JDBC URL of your Teradata server, the driver path, and the driver class name. Once you validate that the connection is fine, you can take a look at the GenerateTableFetch and QueryDatabaseTable processors for incremental table ingestion.
Hortonworks has an article about configuring DBCPConnectionPool with a Teradata server: https://community.hortonworks.com/articles/45427/using-teradata-jdbc-connector-in-nifi.html
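To make the suggestion above concrete, a DBCPConnectionPool controller service for Teradata would be configured along these lines — host, database, user, and jar paths here are hypothetical examples, not values from your environment:

```
Database Connection URL:      jdbc:teradata://teradata-host/DATABASE=mydb
Database Driver Class Name:   com.teradata.jdbc.TeraDriver
Database Driver Location(s):  /opt/nifi/drivers/terajdbc4.jar,/opt/nifi/drivers/tdgssconfig.jar
Database User:                etl_user
Password:                     ********
```

Point an ExecuteSQL processor at this controller service with a trivial query (e.g. SELECT against a small table) to validate connectivity, then wire GenerateTableFetch or QueryDatabaseTable into a PutHDFS processor for the actual Teradata-to-HDFS flow.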
I'm working with C++ in Visual Studio 2015. Currently I'm using SQL Server for the database, but now I'm switching to a Postgres DB and I'm not finding any relevant OLE DB consumer/provider. Please advise.
You can either work with the PGNP OLEDB Provider for PostgreSQL, Greenplum and Redshift, or use a combination of the Microsoft OLE DB Provider for ODBC and psqlODBC.
Although the second combination involves more components (and probably more layers), it's likely more up to date.
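For the second option, the OLE DB connection string routes through MSDASQL (the OLE DB provider for ODBC) to the psqlODBC driver. A sketch, with placeholder host, database, and credentials:

```
Provider=MSDASQL;Driver={PostgreSQL Unicode};Server=dbhost;Port=5432;Database=mydb;Uid=appuser;Pwd=secret;
```

The `{PostgreSQL Unicode}` name must match the driver name that the psqlODBC installer registers on the machine (there is also an ANSI variant); check the ODBC Data Source Administrator if the connection fails to find the driver.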
I don't know whether it was a mistake to use the Firebird database. It has lots of good features, but I can't figure out why my query (stored procedure) doesn't work.
Is there any profiler/monitoring tool for Firebird?
The Firebird database is running standalone, so it is an embedded DB, and it doesn't allow two users to connect at once. If there is a profiler, I wonder how it would connect while I'm executing my queries.
IBExpert and Database Workbench have a stored procedure debugger.
There are also many monitoring tools: http://www.firebirdfaq.org/faq95/
I advise you to install the server version if you want to have more than 2 users.
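If an external profiler can't attach because of the embedded single-connection limit, Firebird's own monitoring tables (available since Firebird 2.1) can be queried from the same connection that runs your stored procedure. A quick illustration of the kind of query involved:

```sql
-- Show who is connected and which SQL statements are active
SELECT a.MON$USER, s.MON$STATE, s.MON$SQL_TEXT
FROM MON$STATEMENTS s
JOIN MON$ATTACHMENTS a ON a.MON$ATTACHMENT_ID = s.MON$ATTACHMENT_ID;
```

Because the MON$ snapshot is taken inside your own attachment, this works even when no second client can connect.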