Scenario:
I run SAS programs on a Sasfusion server, abc123.
I run regular SAS programs with PROC CONTENTS, PROC PRINT, PROC EXPORT, etc., with no issues.
I was just asked to access a SAS dataset in a location on a separate server, zyx123 (I'm not sure whether it is Sasfusion or not).
I thought this would be as simple as adding a LIBNAME statement:
libname z '\path to the folder\' server=zyx123;
proc contents data = z.requesteddataset;
run;
Is it even possible to do the above?
I get the following errors:
ERROR: Libref TEST is not assigned.
ERROR: Error in the LIBNAME statement.
ERROR 23-7: Invalid value for the SERVER option.
The LIBNAME option SERVER= is used to access data sets on a machine that has SAS/SHARE licensed and is running a SAS session in which PROC SERVER is executing.
See the SAS/SHARE User's Guide, LIBNAME documentation:
LIBNAME Statement
In a client session, associates a libref (a shortcut name) with a SAS library that is located on the server for client access. In a server session, predefines a server library that clients are permitted to access.
…
SERVER=<server-node.>server-name | __port-number
specifies the location and identity of the SAS/SHARE server session that administers the SAS library.
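For illustration only, here is a minimal sketch of the two sides, assuming SAS/SHARE is licensed and using a made-up server ID shrsrv (your site's actual names will differ):
/* server session on zyx123 (needs SAS/SHARE) -- shrsrv is a placeholder server ID */
proc server id=shrsrv;
run;
/* client session on abc123 -- SERVER= names node.server-ID; the quoted path is
   interpreted by the server session on zyx123, not locally */
libname z '\path to the folder\' server=zyx123.shrsrv;
proc contents data=z.requesteddataset;
run;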
You should request additional information about the remote server from your network / SAS administrator.
I'm not sure about the specific ERROR messages in your log -- on my machine, the log for an unsuccessful LIBNAME looks like:
12 libname foo 'c:\temp' server=#######.######;
ERROR: Attempt to connect to server #######.###### failed.
ERROR: A communication subsystem partner link setup request failure has occurred.
ERROR: The connection was refused.
ERROR: Error in the LIBNAME statement.
PROC SQL;
  UPDATE CLEAN.Unknown_Provs AS A
    SET ProviderAddress1 = (SELECT COALESCEC(Prac1stAddr, Mail1stAddr)
                            FROM HED.tblProviderNPI AS B
                            WHERE A.UnknownFile_ProviderKey = B.NPI)
    WHERE ProviderAddress1 = '';
QUIT;
HED is the libname assigned to the SQL Server database.
How can I fix the error?
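In case it helps, here is a minimal sketch of what I could run to surface the error that SQL Server actually returns (assuming HED is a SAS/ACCESS libref; these options just echo the DBMS messages to the SAS log):
/* assumption: HED is a SAS/ACCESS libref; send the DBMS-side messages to the SAS log */
options sastrace=',,,d' sastraceloc=saslog nostsuffix;
/* rerun the PROC SQL step above and check the log for the passed-through SQL Server error */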
When I try to connect Snowflake to Power BI, I get an error:
"ODBC: ERROR [HY000] [Microsoft][Snowflake] (4)
REST request for URL https://eda87722.snowflakecomputing.com:443/session/v1/login-request?requestId=9eb99320-1fc6-49ee-859b-0fca3e70638b&request_guid=5f7d40b7-7025-410c-8905-da5b7e7e78cd&warehouse=COMPUTE_WH failed: CURLerror (curl_easy_perform() failed) - code=5 msg='Couldn't resolve proxy name' osCode=9 osMsg='Bad file descriptor'."
Interestingly, I am able to establish the connection on another machine with no issues, but I can't find the reason for this one.
What I've tried:
Established the connection with VPN on and VPN off - no difference
Checked the environment variables http_proxy and https_proxy (http://proxyserver.internal) - they exist
Checked my access to the data in Snowflake - I have access
Reinstalled Power BI
Tried to install the Snowflake ODBC driver, but it is not visible in the list of my drivers
Tried to connect to different warehouses in Snowflake - same issue
The 'Auto resume' option for my warehouse is on
Could you please advise what else I can do?
Thank you.
I am new to Informatica, and I am getting the error below while running a simple DB-to-DB table load.
Message Code: WRT_8229
Message: Database errors occurred:
FnName: Execute -- [Informatica][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Incorrect syntax near '$'.
FnName: Execute -- [Informatica][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Statement(s) could not be prepared.
FnName: Execute -- [Microsoft][ODBC Driver Manager] Function sequence error
I am not using any parameters in the mapping, and I do not understand why I am getting the
'Incorrect syntax near '$'' error.
I have another mapping that loads the same table, and it works without any issues.
This error can occur when the target table contains a column name with a space or special character; in your case it appears to be $. You need to add QuotedId=Yes to the target connection entry in your odbc.ini file. Check whether you are using the same connection as the mapping that works. If not, you need to append the parameter below to your connection.
Open .odbc.ini; it is in the $ODBC_INI location.
Append the entry below and save it.
[TGT_CONN]
...
QuotedId=Yes
Explanation: the QuotedId parameter determines whether the driver uses quoted identifiers when reading and writing, so Informatica should then generate SQL like SELECT "abc_$" FROM "table".
I am trying to write a LIBNAME statement in SAS Enterprise Guide to refer to an existing ODBC connection to a SQL database. This program is shared with others, so coding my user name and password into it is not an option. Ideally I would like the program to prompt me for a user name and password, but I can't get it to work. Here is what I have tried:
Works (but is not acceptable):
libname libref odbc dsn=DSN_Name uid=User_Name pwd=My_Password;
(where "libref", "DSN_Name", "User_Name", and "My_Password" are my actual values)
Does not work:
a) libname libref odbc dsn=DSN_Name;
result: ERROR: CLI error trying to establish connection: [Microsoft][SQL Server Native Client 11.0][SQL Server]Login failed for user ''.
ERROR: Error in the LIBNAME statement.
b) libname libref odbc dsn=DSN_Name prompt=yes;
result: ERROR: CLI error trying to establish connection: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver
specified : [Microsoft][ODBC Driver Manager] Invalid connection string attribute
ERROR: Error in the LIBNAME statement.
c) libname libref odbc prompt=yes;
result: ERROR: CLI error trying to establish connection: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver
specified : [Microsoft][ODBC Driver Manager] Invalid connection string attribute
ERROR: Error in the LIBNAME statement.
plus a bunch of other such combos. I feel like I am missing something, but what? Are there other options which don't involve me writing my password in and inevitably forgetting to delete it? Thank you!
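For completeness, one more variant I have sketched but not yet verified -- based on my assumption that PROMPT= expects a quoted connection string rather than yes/no (DSN_Name is a placeholder as above):
d) libname libref odbc prompt="dsn=DSN_Name;";
/* assumption: PROMPT= takes a quoted (partial) connection string, so the ODBC
   driver manager should prompt for whatever is missing, such as uid and pwd */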
Getting this error when connecting Power BI to Azure Databricks through the built-in Spark connector:
Details: "ODBC: ERROR [HY000] [Microsoft][DriverSupport] (1170)
Unexpected response received from server. Please ensure the server
host and port specified for the connection are correct."
I have checked the host and port of the Databricks cluster many times, and I have also tried after restarting the cluster.
Guide for the connection:
https://docs.azuredatabricks.net/user-guide/bi/power-bi.html
Got the same problem today. I followed these instructions and it worked.
The user was not able to import SQL data into Power BI and was getting this error, while testing the connection in ODBC was successful.
It turned out that he had old credentials stored in Power BI, and that caused identification issues. Purging the cached data sources (Power BI: Home > Edit Queries > Data source settings) resolved the issue.