Azure WebJob FileTrigger Path 'D:\home\data\...' does not exist - azure-webjobs

I've created a WebJob to read files from Azure Files when they are created.
When I run it locally it works, but not when I publish the WebJob.
My Main() function is:
static void Main()
{
    string connection = "DefaultEndpointsProtocol=https;AccountName=MYACCOUNTNAME;AccountKey=MYACCOUNTKEY";
    JobHostConfiguration config = new JobHostConfiguration(connection);
    var filesConfig = new FilesConfiguration();
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
        filesConfig.RootPath = @"c:\temp\files";
    }
    config.UseFiles(filesConfig);
    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
The function to be triggered when the file is created is:
public void TriggerTest([FileTrigger(@"clients\{name}", "*.txt", WatcherChangeTypes.Created)] Stream file, string name, TextWriter log)
{
    log.WriteLine(name + " received!");
    // ...
}
And the error I get when the WebJob is published is:
[08/17/2016 00:15:31 > 4df213: ERR ] Unhandled Exception: System.InvalidOperationException: Path 'D:\home\data\clients' does not exist.
The idea is to make the WebJob trigger when new files are created in the "clients" folder of the Azure Files share.
Can someone help me?

Based on your description, I tested this on my side and reproduced your issue:
Unhandled Exception: System.InvalidOperationException: Path 'D:\home\data\clients' does not exist
When the WebJob is published, FilesConfiguration.RootPath defaults to the D:\home\data directory when running in an Azure Web App. You can refer to the source code:
https://github.com/Azure/azure-webjobs-sdk-extensions/blob/master/src/WebJobs.Extensions/Extensions/Files/Config/FilesConfiguration.cs
As the following tutorial mentions, FilesConfiguration.RootPath must be set to a valid directory:
https://azure.microsoft.com/en-us/blog/extensible-triggers-and-binders-with-azure-webjobs-sdk-1-1-0-alpha1
Please check and make sure that the specified directory exists in the Web App that hosts your WebJob.
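A minimal sketch of one way to handle this (the else branch is my addition; it assumes the watched folder should live under D:\home\data, and relies on the HOME environment variable, which points at D:\home on Azure App Service):

static void Main()
{
    var config = new JobHostConfiguration();
    var filesConfig = new FilesConfiguration();
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
        filesConfig.RootPath = @"c:\temp\files";
    }
    else
    {
        // In Azure, RootPath defaults to D:\home\data; create the watched
        // subfolder up front so the FileTrigger does not throw on startup.
        string home = Environment.GetEnvironmentVariable("HOME"); // D:\home on App Service
        System.IO.Directory.CreateDirectory(System.IO.Path.Combine(home, "data", "clients"));
    }
    config.UseFiles(filesConfig);
    new JobHost(config).RunAndBlock();
}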
Trigger when new files are created in the "clients" folder of the Azure Files via WebJob
As far as I know, there are two triggers for Azure Storage:
QueueTrigger - fires when a message is enqueued on an Azure Queue.
BlobTrigger - fires when an Azure Blob is uploaded.
The new WebJobs SDK provides a FileTrigger that can trigger functions based on file events.
However, a FileTrigger can only monitor file additions/changes in a particular local directory; there seems to be no trigger that monitors file additions/changes on Azure File Storage.

In the Azure environment, WebJobs are stored under the persistent folder known as D:\home, while D:\local is the temporary, machine-local folder available to the process. I needed a folder for temporary use: downloading a file from an SFTP server, reading the file back from that local temporary location, and consuming it in my application.
I used D:\local\Temp as the temporary folder. The code first checks whether the folder exists and creates it if necessary; it then downloads the file from the server into that location, reads it back from the same location, and finally deletes the file from the temporary folder.
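A minimal sketch of that pattern (file names are illustrative, and the SFTP download step is elided):

using System.IO;

string tempFolder = @"D:\local\Temp";
// Create the temporary folder if it does not already exist.
if (!Directory.Exists(tempFolder))
{
    Directory.CreateDirectory(tempFolder);
}

string localFile = Path.Combine(tempFolder, "download.dat");
// ... download the file from the SFTP server into localFile ...

string contents = File.ReadAllText(localFile); // consume the file
File.Delete(localFile);                        // clean up the temporary copy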

Related

WinSCP error while performing directory Sync

I've developed a .NET console application to run as a WebJob under Azure App Service.
This console app uses WinSCP to transfer files from the App Service filesystem to an on-premises FTP server.
The job is failing with the error below:
Upload of "D:\ ...\log.txt" failed: WinSCP.SessionRemoteException: Error deleting file 'log.txt'. After resumable file upload the existing destination file must be deleted. If you do not have permissions to delete file destination file, you need to disable resumable file transfers.
Here is the code snippet I use to perform the directory sync (I've disabled deletion):
var syncResult = session.SynchronizeDirectories(SynchronizationMode.Remote, localFolder, remoteFolder, false, false);
Any clues on how to disable resumable file transfers?
Use TransferOptions.ResumeSupport:
var transferOptions = new TransferOptions();
transferOptions.ResumeSupport.State = TransferResumeSupportState.Off;

var syncResult =
    session.SynchronizeDirectories(
        SynchronizationMode.Remote, localFolder, remoteFolder, false, false,
        options: transferOptions);
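For context, here is a self-contained sketch of where the transfer options fit into a full WinSCP session (host name, credentials, and folder paths are placeholders):

using WinSCP;

var sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "ftp.example.com", // placeholder
    UserName = "user",            // placeholder
    Password = "password"         // placeholder
};

string localFolder = @"D:\home\site\wwwroot\logs"; // placeholder
string remoteFolder = "/upload";                   // placeholder

var transferOptions = new TransferOptions();
transferOptions.ResumeSupport.State = TransferResumeSupportState.Off;

using (var session = new Session())
{
    session.Open(sessionOptions);
    var syncResult = session.SynchronizeDirectories(
        SynchronizationMode.Remote, localFolder, remoteFolder,
        removeFiles: false, mirror: false, options: transferOptions);
    syncResult.Check(); // throws if any transfer in the sync failed
}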

How do I connect to an Oracle cloud database within a C++ application?

I have a cloud database with Oracle and I'm trying to learn how to execute SQL commands within my C++ application on Windows.
Here's what I've tried.
1) Using Instant Client (OCCI)
I have tried to follow these docs posted by Oracle
I downloaded Instant Client, unzipped it, and put it under a directory called Oracle
I downloaded the SDK for Instant Client and put it under the same directory
I downloaded the wallet files and put them under the network/admin directory
I set the PATH variable to the Oracle/instant_client directory and created a user variable called ORACLE_HOME set to the same directory
Created a TNS_ADMIN user variable and set it to network/admin
Created an empty Visual Studio project, added the SDK to the additional include directories, and added the SDK to the additional library directories under the project properties
Added the .lib files to the additional dependencies under the project properties (linker -> input)
Then I wrote this code
Environment* env;
Connection* conn;
env = Environment::createEnvironment(Environment::DEFAULT);
conn = env->createConnection ("username", "password", "(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp) (HOST=myserver111) (PORT=5521))(CONNECT_DATA = (SERVICE_NAME = bjava21)))");
env->terminateConnection(conn);
Environment::terminateEnvironment(env);
It compiles, runs, and builds the environment successfully but an error occurs when it tries to create a connection
2) Using ODBC
Downloaded the ODBC Oracle Driver
Added a new data source name to the System DSN list
Tested connection successfully
Abandoned efforts as I couldn't find helpful and simple docs to help me with my project
3) Using Oracle Developer Tools for Visual Studio
I downloaded the developer tools and added them to visual studio
Connected to my Oracle database using Server Explorer
Was able to see my tables and data and modify them using the Server Explorer
Was unable to find docs or be able to add code that allowed me to execute SQL queries
Update
I have removed ORACLE_HOME and TNS_ADMIN as environment variables
I've tried to connect to my remote database using Instant Client SDK but no luck
I have tried the following and nothing has worked
createConnection("username", "password", "(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp) (HOST=myserver111) (PORT=5521))(CONNECT_DATA = (SERVICE_NAME = bjava21)))")
createConnection("username", "password", "//host:[port][/service name]")
createConnection("username", "password", "xxx_low")
createConnection("username", "password", "protocol://host:port/service_name?wallet_location=/my/dir&retry_count=N&retry_delay=N")
createConnection("username", "password", "username/password#xxx_low")
I'm able to connect with SQL*Plus in a variety of ways, but not in my C++ application.
Error While Debugging:
Unhandled exception in exe: Microsoft C++ exception: oracle::occi::SQLException at memory location
Full Code
#include <occi.h>
using namespace oracle::occi;

int main() {
    Environment* env;
    Connection* conn;
    env = Environment::createEnvironment(Environment::DEFAULT);
    conn = env->createConnection("username", "password", "(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp) (HOST=myserver111) (PORT=5521))(CONNECT_DATA = (SERVICE_NAME = bjava21)))");
    env->terminateConnection(conn);
    Environment::terminateEnvironment(env);
    return 0;
}
With Instant Client in general:
Don't set ORACLE_HOME; it can have side effects.
You don't need to set TNS_ADMIN, since you put the unzipped network files in the default directory.
For cloud:
In the app, use one of the network aliases from the tnsnames.ora file (e.g. xxxx_low). You can see the descriptors there have extra info that your hard-coded descriptor is missing.
ODBC will be exactly the same: once you have the wallet files extracted to the default network/admin subdirectory, you just need to connect with the DB credentials and use a network alias from the tnsnames.ora file.
More info is in my blog post How to connect to Oracle Autonomous Cloud Databases.
The official doc is Connect to Autonomous Database Using Oracle Database Tools.

Unity UWP build can't find AWSCredentials in the shared credentials file in the default location

I'm running some tests (create a bucket, upload a file, list buckets, etc.) with Unity-UWP and Amazon AWS.
When I play it in the Editor, everything works fine, but the UWP build can't find my AWS credentials. This is my code:
void Start()
{
    // chain, awsCredentials, and client are fields declared elsewhere in the class (not shown)
    chain = new CredentialProfileStoreChain();
    if (chain.TryGetAWSCredentials("default", out awsCredentials))
    {
        client = new AmazonS3Client(awsCredentials);
        Debug.Log("Credential OK");
    }
    else
    {
        Debug.Log("Credential NO OK");
    }
}
So every time I get "Credential NO OK" and can't continue with the tests.
Could it be because UWP is heavily sandboxed and the user has not given explicit access to the "credentials" and "config" files in the default location?
If so, what could be the solution or workaround? I'd rather not put my credentials in the code.
Unity Version: 2020.3.3f1
AWS SDK: Version 3.7.38 of the netstandard2.0 DLLs
Build Environment: Visual Studio 2019
Build Type: Executable Only (for fast iteration and local test)
Build Configuration: Release
Target Architecture: x64
Test Environment: UWP running on Windows 10 Desktop, build 19041.985
Api Compatibility Level: .NET Standard 2.0
Also, I added a link.xml file to preserve my AWS DLLs, and "internet client" is enabled in both the player settings and the appxmanifest.
Thank you in advance.
As a workaround, I set my credentials in environment variables. As the post From where and in what order does the AWS .NET SDK load credentials? suggests, environment variables are one of the places the SDK looks for AWS credentials, so they can be supplied without hard-coding them.
You can learn how to configure the environment variables in the Amazon docs.
This doesn't explain why TryGetAWSCredentials from the default location doesn't work on UWP, but it allows me to continue with my tests.
Finally, my code to, for example, list my buckets with this approach:
public async void RegionCredentials()
{
    Amazon.RegionEndpoint region = Amazon.RegionEndpoint.GetBySystemName("us-east-1");
    // With no explicit credentials, the client falls back to the environment variables.
    using (var client3 = new AmazonS3Client(region))
    {
        var response = await client3.ListBucketsAsync();
        Debug.Log("Region OK");
        Debug.Log("Region - Number of buckets: " + response.Buckets.Count);
        foreach (S3Bucket bucket in response.Buckets)
        {
            Debug.Log("Region - You own Bucket with name: " + bucket.BucketName);
        }
    }
}
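If you prefer to resolve the credentials explicitly rather than rely on the SDK's fallback chain, the AWS SDK for .NET also has an EnvironmentVariablesAWSCredentials type in Amazon.Runtime; a minimal sketch (the region is illustrative):

using Amazon;
using Amazon.Runtime;
using Amazon.S3;

// Reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY (and AWS_SESSION_TOKEN,
// if set) from the process environment.
var credentials = new EnvironmentVariablesAWSCredentials();
var client = new AmazonS3Client(credentials, RegionEndpoint.USEast1);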

WebSphere 8.5.5 Error: PLGC0049E: The propagation of the plug-in configuration file failed for the Web server

I just installed a new WebSphere 8.5.5 ESB on Linux CentOS 7.
I did the entire installation as the root user.
Then I did the following steps to create a web server:
1) create server with user wasadmin
2) Generate plugin
3) Propagate plugin
In the last step I get the error:
PLGC0049E: The propagation of the plug-in configuration file failed for the Web server. test2lsoa01-02Node01Cell.XXXXXXXXX-node.IHSWebserver.
Error: A problem was encountered transferring the designated file. Make sure the file exists and has correct access permissions.
The file /u01/apps/IBM/WebSphere/profiles/ApplicationServerProfile1/config/cells/test2lsoa01-02Node01Cell/nodes/XXXXX-node/servers/IHSWebserver/plugin-cfg.xml exists.
As a test, I gave plugin-cfg.xml chmod 777.
Still the error is not going away.
Can someone help?
User wasadmin would be the user attempting to move the file. Ensure that ID can access /u01/apps/IBM/WebSphere/profiles/ApplicationServerProfile1/config/cells/test2lsoa01-02Node01Cell/nodes/XXXXX-node/servers/IHSWebserver/plugin-cfg.xml. There is also a target directory (in the web server installation that plugin-cfg.xml is being moved to): if propagating using node sync, ensure that wasadmin has write access to that target location; if using IHS admin, ensure that the user ID/password defined in the web server definition has write access to it.
A good test would be to access the source plugin-cfg.xml as that user ID and attempt to manually copy the file to the target location with the appropriate ID (based on whether node sync or IHS admin is used).

How do I test Azure WebJobs SDK projects locally?

I want to be able to test an Azure WebJobs SDK project locally, before I actually publish it to Azure.
If I make a brand new Azure Web Jobs Project, I get some code that looks like this:
Program.cs:
// To learn more about Microsoft Azure WebJobs SDK, please see http://go.microsoft.com/fwlink/?LinkID=320976
class Program
{
    // Please set the following connection strings in app.config for this WebJob to run:
    // AzureWebJobsDashboard and AzureWebJobsStorage
    static void Main()
    {
        var host = new JobHost();
        // The following code ensures that the WebJob will be running continuously
        host.RunAndBlock();
    }
}
Functions.cs:
public class Functions
{
    // This function will get triggered/executed when a new message is written
    // on an Azure Queue called queue.
    public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
    {
        log.WriteLine(message);
    }
}
I would like to test whether the QueueTrigger function is working properly, but I can't even get that far, because on host.RunAndBlock(); I get the following exception:
An unhandled exception of type 'System.InvalidOperationException' occurred in mscorlib.dll
Additional information: Microsoft Azure WebJobs SDK Dashboard connection string is missing or empty. The Microsoft Azure Storage account connection string can be set in the following ways:
1. Set the connection string named 'AzureWebJobsDashboard' in the connectionStrings section of the .config file in the following format <add name="AzureWebJobsDashboard" connectionString="DefaultEndpointsProtocol=http|https;AccountName=NAME;AccountKey=KEY" />, or
2. Set the environment variable named 'AzureWebJobsDashboard', or
3. Set corresponding property of JobHostConfiguration.
I ran the storage emulator, and set the Azure AzureWebJobsDashboard connection string like so:
<add name="AzureWebJobsDashboard" connectionString="UseDevelopmentStorage=true" />
but when I did that, I got a different error:
An unhandled exception of type 'System.InvalidOperationException' occurred in mscorlib.dll
Additional information: Failed to validate Microsoft Azure WebJobs SDK Dashboard account. The Microsoft Azure Storage Emulator is not supported, please use a Microsoft Azure Storage account hosted in Microsoft Azure.
Is there any way to test my use of the WebJobs SDK locally?
WebJobs 2.0 now works using development storage (I'm using v2.0.0-beta2).
Note that latency in general, and for Blob triggers in particular, is currently far better locally than what you can get in production. Design with care.
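A minimal sketch of wiring a host to the local emulator (assuming the Azure Storage Emulator is running; the property assignments below are one way to do it, equivalent to setting the two connection strings in app.config):

static void Main()
{
    var config = new JobHostConfiguration
    {
        // Point both the runtime and the dashboard at the local emulator.
        StorageConnectionString = "UseDevelopmentStorage=true",
        DashboardConnectionString = "UseDevelopmentStorage=true"
    };
    var host = new JobHost(config);
    host.RunAndBlock();
}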
If you want to test the WebJobs SDK locally, you need to set up a storage account in Azure. You can't test it against the Azure Emulator. That's what that error is telling you.
Failed to validate Microsoft Azure WebJobs SDK Dashboard account. The Microsoft Azure Storage Emulator is not supported, please use a Microsoft Azure Storage account hosted in Microsoft Azure.
So to answer your question, you can create a storage account in Azure using the portal, and then set up your connection string in the app.config of your Console Application. Then just drop a message to the queue and run the Console Application locally and it will pick it up (assuming you're trying to interact with the queue obviously).
Make sure that you replace the [QueueTrigger("queue")] "queue" with the name of the queue you want to poll.
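For example, a minimal sketch (using the classic WindowsAzure.Storage package; the queue name must match the trigger attribute) that drops a test message for the locally running console application to pick up:

using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

var account = CloudStorageAccount.Parse(
    ConfigurationManager.ConnectionStrings["AzureWebJobsStorage"].ConnectionString);
var queue = account.CreateCloudQueueClient().GetQueueReference("queue"); // same name as [QueueTrigger("queue")]
queue.CreateIfNotExists();
queue.AddMessage(new CloudQueueMessage("Hello from a local test"));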
Hope this helps