SmtpClient.Send - Could not find a part of the path

I'm trying to write an email to my local folder. I successfully wrote an email to my documents folder using this code:
using (var client = new SmtpClient())
{
    client.UseDefaultCredentials = true;
    client.DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory;
    client.PickupDirectoryLocation = tempDocsPath;
    client.Send(message); // Writes to the PickupDirectoryLocation
}
However, when I ported this same code to another project, it gives me this error:
System.Net.Mail.SmtpException : Failure sending mail. ---> System.IO.DirectoryNotFoundException : Could not find a part of the path 'C:\Users\josh.bowdish\source\repos\GenerateEmail\GenerateEmail\bin\Debug\net461\tempFiles\AAMkAGUyODNhN2JkLThlZWQtNDE4MS1hODM1LWU0ZDY4Y2NhYmMxOQBGAAAAAABKB1jlHZSIQZSWN7AYZH2SBwDZdOTdKcayQ5NMwcwkNT7UAAAAAAEMAADZdOTdKcayQ5NMwcwkNT7UAACn\0a5b24a5-d625-4ecd-9990-af5654679820.eml'.
I've verified that the directory it's trying to write to exists, even rewrote it to look like this:
private static string WriteEmail(MailMessage message, string messageDirectory)
{
    if (Directory.Exists(messageDirectory))
    {
        using (var client = new SmtpClient())
        {
            client.UseDefaultCredentials = true;
            client.DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory;
            client.PickupDirectoryLocation = messageDirectory;
            client.Send(message); // Writes to the PickupDirectoryLocation
        }
        ...
    }
    //stuff that returns the full email path
}
It breaks on the client.Send() line with the above error. As far as I can tell the code paths are identical. I've tried writing to the same folder that the other project is working with to no avail. The only thing I can think of is it's trying to write the email file before it exists, but the other project is writing it just fine.
Can someone tell me what is generating this error?
Thanks,
~Josh

This could be a permissions problem. Ensure that the account your application is running under has permission to write to this directory. Your Directory.Exists check could be passing because it only checks whether the directory is there, while the actual write then fails.
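As a quick check (a minimal sketch, assuming the same messageDirectory value that gets assigned to PickupDirectoryLocation), you can probe the directory with a throwaway write before calling Send:
// Hedged sketch: probe write access by creating and deleting a temporary file.
// Requires System.IO; "messageDirectory" is the path from the question.
private static bool CanWriteTo(string messageDirectory)
{
    try
    {
        var probePath = Path.Combine(messageDirectory, Path.GetRandomFileName());
        using (File.Create(probePath)) { }   // throws if the account cannot write here
        File.Delete(probePath);
        return true;
    }
    catch (Exception ex) when (ex is UnauthorizedAccessException || ex is IOException)
    {
        return false;
    }
}
If this returns false for the directory in the error message, the problem is the process identity's rights on that folder rather than the SmtpClient code itself.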

Related

Google Cloud platform terraform/terragrunt googleapi: Error 409: Requested entity already exist

I am having a strange issue when trying to push code out to our GCP repo. It fails with the following error, "googleapi: Error 409: Requested entity already exists, alreadyExists", and it is referring to a project that already exists. This only occurs after I either remove another project that's no longer needed or add .bck to the terragrunt.hcl files. These projects have no dependencies on each other whatsoever.
terraform {
  source = "../../../modules//project/"
}
include {
  path = find_in_parent_folders("org.hcl")
}
dependency "folder" {
  config_path = "../"
  # Configure mock outputs for the terraform commands that are returned when there are no
  # outputs available (e.g. the module hasn't been applied yet).
  mock_outputs_allowed_terraform_commands = ["plan", "validate"]
  mock_outputs = {
    folder_id = "folder-not-created-yet"
  }
}
inputs = {
  project_name       = "<pimsstest-project>"
  folder_id          = dependency.folder.outputs.folder_created # Test folder id
  is_service_project = true
}
The code push fails when the folder structure in VS Code looks like this:
But it succeeds when it looks like this:
Some background to add: pimsstest used to exist in a production folder under the org, and I moved it to test via VS Code with a simple cut and paste and a re-push of the code. I then removed the project from the console as it still existed in production. I cannot work out why the removal of another project would flag up this already-exists error on pimsstest. It doesn't make any sense to me.
Across GCP a project ID can exist only once. Upon removal, it might not be instantly available again (it will keep the status "scheduled for removal" for a while, and you should receive an email with the details of the scheduled operation). What the error message is actually trying to tell you may be:
Error 409: Requested entity already STILL exists.
In theory (even if it's unlikely when the name is unique enough), any other customer could snatch the name in the meantime, in which case the error message could be understood literally.

Can't upload file to aws s3 asp.net

fileTransferUtility = new TransferUtility(s3Client);
try
{
    if (file.ContentLength > 0)
    {
        var filePath = Path.Combine(Server.MapPath("~/Files"), Path.GetFileName(file.FileName));
        var fileTransferUtilityRequest = new TransferUtilityUploadRequest
        {
            BucketName = bucketName,
            FilePath = filePath,
            StorageClass = S3StorageClass.StandardInfrequentAccess,
            PartSize = 6291456, // 6 MB.
            Key = keyName,
            CannedACL = S3CannedACL.PublicRead
        };
        fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
        fileTransferUtilityRequest.Metadata.Add("param2", "Value2");
        fileTransferUtility.Upload(fileTransferUtilityRequest);
        fileTransferUtility.Dispose();
    }
I'm getting this error:
The file indicated by the FilePath property does not exist!
I tried changing the path to the actual path of the file, C:\Users\jojo\Downloads, but I'm still getting the same error.
(Based on a comment above indicating that file is an instance of HttpPostedFileBase in a web application...)
I don't know where you got Server.MapPath("~/Files") from, but if file is an HttpPostedFileBase that's been uploaded to this web application code then it's likely in-memory and not on your file system. Or at best it's on the file system in a temp system folder somewhere.
Since your source (the file variable contents) is a stream, before you try to interact with the file system you should see if the AWS API you're using can accept a stream. And it looks like it can.
if (file.ContentLength > 0)
{
    var transferUtility = new TransferUtility(/* constructor params here */);
    transferUtility.Upload(file.InputStream, bucketName, keyName);
}
Note that this is entirely free-hand, I'm not really familiar with AWS interactions. And you'll definitely want to take a look at the constructors on TransferUtility to see which one meets your design. But the point is that you're currently looking to upload a stream from the file you've already uploaded to your web application, not looking to upload an actual file from the file system.
As a fallback, if you can't get the stream upload to work (and you really should, that's the ideal approach here), then your next option is likely to save the file first and then upload it using the method you have now. So if you're expecting it to be in Server.MapPath("~/Files") then you'd need to save it to that folder first, for example:
file.SaveAs(Path.Combine(Server.MapPath("~/Files"), Path.GetFileName(file.FileName)));
Of course, over time this folder can become quite full and you'd likely want to clean it out.
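If you do go the save-then-upload route, a rough end-to-end sketch (reusing s3Client, bucketName and keyName from the question; the delete at the end is just one way to keep ~/Files from filling up) might look like this:
// Hedged sketch of the fallback: persist the posted file to disk, upload it by
// path with TransferUtility, then remove the temporary copy.
if (file.ContentLength > 0)
{
    var filePath = Path.Combine(Server.MapPath("~/Files"), Path.GetFileName(file.FileName));
    file.SaveAs(filePath);                       // write the upload to disk first

    using (var transferUtility = new TransferUtility(s3Client))
    {
        transferUtility.Upload(filePath, bucketName, keyName);
    }

    File.Delete(filePath);                       // clean up so ~/Files doesn't fill up
}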

DropNet returning metadata for root folder not folder requested

Problem: GetMetaData for the folder that I need returns the root folder metadata.
Background:
I'm trying to write a small app to download a folder that is too large (many thousand files and multiple GB) to download from the Dropbox web interface. It tries to recurse through the subdirectories of the directory given, downloading all the files.
What actually happens is an endless loop. The app (incorrectly) gets the root folder metadata, iterates through the directories until it hits the directory I need and then starts working through the root directory as that is the metadata set that it receives.
The directory name "/Apps" works fine but the one I need doesn't. The folder name has an underscore and a mix of upper and lower case letters (no other characters) similar to "/XYX_DataFolder".
My app has "Full Dropbox" permission and I authorized with the account that the api key was acquired under.
Changing the directory name is not an option for me.
I'm using VS2012 and the DropNet was added through NuGet.
Any input on this issue would be welcome. Thanks!
Edit:
Runtime Version v4.0.30319
Version 1.10.23.0
As reported in the Visual Studio properties page for the reference.
I authorize which works fine and then use the code below. Some directories work fine but when I try to GetMetaData on the folder mentioned above, I get the metadata from the root folder.
private void DownloadDirectory(string serverDirectory, string clientDirectory)
{
    var meta = m_client.GetMetaData(serverDirectory, false, false);
    foreach (var item in meta.Contents)
    {
        var destinationPath = Path.Combine(clientDirectory, item.Name);
        if (item.Is_Dir && item.Path == m_serverRootDirectory)
        {
            DownloadDirectory(item.Path, destinationPath);
        }
        else
        {
            //var fileBytes = m_client.GetFile( item.Path );
            //File.WriteAllBytes( destinationPath, fileBytes );
            //textBox1.Text += Environment.NewLine + destinationPath;
        }
    }
}
Ok, so I downloaded the source and found my problem right away. I was missing a null for the hash in the GetMetaData call, so it was using the wrong overload. Sorry to waste your time... Thanks for the response!
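Based on that finding, the fix is presumably just adding the null hash argument so the intended overload is chosen (a sketch only; the exact parameter order may differ between DropNet versions, so treat the signature as an assumption):
// Hedged sketch: pass null for the hash so the call binds to the overload that
// returns metadata for serverDirectory itself rather than the root.
var meta = m_client.GetMetaData(serverDirectory, null, false, false);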

Move a file after an email is sent

I am writing a program that looks for files in a folder, attaches the files to the MailMessage and sends an email using SmtpClient.
After the email is sent out successfully, I want to move the emailed files to a different folder.
I get this message: "The process cannot access the file because it is being used by another process." I tried Thread.Sleep(), but it did not work.
smtpClient.Send(mail);
foreach (var report in reports)
{
    string source = Path.Combine(reportsFolder, report);
    string destination = Path.Combine(sentReportsFolder, report);
    File.Move(source, destination);
}
First, try disposing your SmtpClient instance:
smtpClient.Send(mail);
smtpClient.Dispose();
http://msdn.microsoft.com/pt-br/library/system.net.mail.smtpclient.dispose.aspx
But when creating the class, you can use a using statement instead, like:
using (SmtpClient smtpClient = new SmtpClient())
{
    // attach files to the message here
    smtpClient.Send(mail);
}
This ensures that after the email is sent the class releases any resources it might have locked, so you don't need to call .Dispose() explicitly.
http://msdn.microsoft.com/pt-br/library/system.net.mail.smtpclient.aspx
http://msdn.microsoft.com/en-us/library/yh598w02.aspx
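Putting that together with the move loop from the question (a rough sketch, assuming reports, reportsFolder, sentReportsFolder, fromAddress and toAddress come from the question's setup): MailMessage is also IDisposable and owns the attachment streams, so disposing it along with the SmtpClient releases the file handles before File.Move runs.
// Hedged sketch (System.Net.Mail, System.IO): dispose both the MailMessage,
// which closes its attachment streams, and the SmtpClient before moving files.
using (var mail = new MailMessage(fromAddress, toAddress))
using (var smtpClient = new SmtpClient())
{
    foreach (var report in reports)
    {
        mail.Attachments.Add(new Attachment(Path.Combine(reportsFolder, report)));
    }
    smtpClient.Send(mail);
}   // attachment file handles are released here

foreach (var report in reports)
{
    File.Move(Path.Combine(reportsFolder, report),
              Path.Combine(sentReportsFolder, report));
}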

Upload file to SharePoint WSS 3.0 with WebRequest PUT

Hey, I've got this nice little piece of code, much like all the other versions of this upload method that use the WSS web services. I've got one major problem though: once I have uploaded a file into my doc list and updated the list item to write a comment/description, the file is stuck there. What I mean is that this method will not overwrite the file once I've uploaded it. Nobody else out there seems to have posted this issue yet, so... anyone?
I have another version of the method which uses a byte[] instead of a Stream .. same issue though.
Note: I have switched off the 'require documents to be checked out before they can be edited' option for the library. No luck though. The doc library does have versioning turned on, with a major version being created for each update.
private void UploadStream(string fullPath, Stream uploadStream)
{
    WebRequest request = WebRequest.Create(fullPath);
    request.Credentials = CredentialCache.DefaultCredentials; // User must have 'Contributor' access to the document library
    request.Method = "PUT";
    request.Headers.Add("Overwrite", "t");
    byte[] buffer = new byte[4096];
    using (Stream stream = request.GetRequestStream())
    {
        for (int i = uploadStream.Read(buffer, 0, buffer.Length); i > 0; i = uploadStream.Read(buffer, 0, buffer.Length))
        {
            stream.Write(buffer, 0, i);
        }
    }
    WebResponse response = request.GetResponse(); // Upload the file
    response.Close();
}
Original credits to: http://geek.hubkey.com/2007/10/upload-file-to-sharepoint-document.html
EDIT -- major finding .. when I call it from my nUnit test project it works fine. It seems it only fails when I call it from my WCF application (nUnit running under logged on user account, WCF app has app pool running under that same user -- my account, which also has valid permissions in SharePoint).
Nuts. "Now where to start?!", I mused to myself.
SOLVED -- I found a little bug - the file was being created in the right place, but the update path was wrong.. I ended up finding a folder full of files with many, many new versions.. doh!
Why not use the out-of-the-box SharePoint webservice, Lists.asmx? You'll find it in
http://SITEURL/_vti_bin/Lists.asmx
Edit: I checked out the link and it seems you are calling the out-of-the-box web service. This has got to be versioning-related then. Can you check the different versions that exist in the doc lib for the specific file? See if it perhaps gets added as a minor version through the service.
Have you tried using a capital T? SharePoint's WebDAV header processing is unlikely to be case-sensitive, but the protocol does specify a capital T. Oh, and what is the response? A 412 error code, or something altogether different?
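Concretely, that suggestion amounts to replacing the existing request.Headers.Add("Overwrite", "t") line with a capital T and surfacing the response status instead of discarding it (a sketch only; GetResponse() wraps failure statuses such as 412 in a WebException):
// Hedged sketch: capital-T Overwrite header plus logging of the HTTP status,
// which WebRequest.GetResponse() otherwise hides inside a WebException.
request.Headers.Add("Overwrite", "T");
try
{
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        Console.WriteLine("Upload returned {0} {1}", (int)response.StatusCode, response.StatusDescription);
    }
}
catch (WebException ex)
{
    var failed = ex.Response as HttpWebResponse;
    if (failed != null)
    {
        Console.WriteLine("Upload failed with {0} {1}", (int)failed.StatusCode, failed.StatusDescription);
    }
    throw;
}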