I need to write a small program that can add existing email accounts to Outlook so that we can receive email messages on a virtual machine. We are using Outlook 2003 and Outlook 2010 (a 2003 solution is more important, though).
It is not possible to do this by hand, as it needs to be part of an automated test, and setting the accounts up manually every time would not be feasible.
I have looked around on Google and can't seem to find any help with using Extended MAPI, so I thought I would ask here.
If anyone could help me out, that would be great. A solution in either VBScript or C# would also be useful.
What kind of account?
POP3/SMTP accounts are not MAPI based, and Microsoft did not document the parts of the IOlkAccountManager interface responsible for creating new accounts.
For Exchange accounts, use IMsgServiceAdmin::CreateMsgService("MSEMS", ...) - MSDN has a few examples of how to do that. For PST accounts, the service name will be "MSPST MS" or "MSUPST MS".
As part of an automation procedure, I must copy emails with attachments (the attachments are .csv files) from Outlook to GCS. Can somebody advise me on how best to complete this process? Please keep in mind that I am new to GCP, so the simplest explanation would be most helpful.
Thanks in advance.
You can use a REST API such as the Microsoft Graph API to retrieve the required data from the Office 365 side and transfer it to GCS. See Use the Microsoft Graph API for more information.
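As a rough illustration, a Graph call to list a mailbox's messages that carry attachments boils down to a GET with an OAuth bearer token. The sketch below (Python, standard library only) just builds the request; acquiring the token (e.g. via MSAL) and the actual GCS upload are out of scope, and the user address and token are placeholders:

```python
# Hypothetical sketch: building (not sending) a Microsoft Graph request
# that lists a user's messages with attachments.
import urllib.parse
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token acquired via MSAL>"  # placeholder, not a real token

def build_messages_request(user: str) -> urllib.request.Request:
    """Build a request for the user's messages that have attachments."""
    params = urllib.parse.urlencode({
        "$filter": "hasAttachments eq true",
        "$select": "subject,hasAttachments",
    })
    url = f"{GRAPH_BASE}/users/{user}/messages?{params}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
    )

req = build_messages_request("someone@example.com")
print(req.full_url)
```

From there you would send the request, page through the results, and fetch each message's attachments before uploading them to a GCS bucket.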
If you are dealing with Outlook as an installed application, you can develop a VBA macro or a COM add-in, or automate Outlook from an external application.
The simplest choice is VBA, which lets you automate tasks in Outlook. VBA macros are not designed for distribution to multiple machines; that is what COM add-ins were invented for. In that case you can create an installer, as for any other distributable software.
The Outlook object model provides a rich set of properties and methods for getting the job done. You can use the Attachment.SaveAsFile method to save attached files to disk, from where you could upload them to GCS.
I am trying to wrap my head around Microsoft Cloud for Sustainability. Apparently it's a solution based on Microsoft Dynamics. I need more back-end access to that solution, because as it stands I'm either lacking permissions (or extra paid access to Microsoft resources) or missing a chunk of documentation, because I'm unable to:
Change the default language across the board - I can switch MS Dynamics to any language I want, but that only affects the shell; anything CfS-specific stays in English. Do I remove the demo data and import my own scopes and data? Since the only things available are the database, the cube for BI analytics, and JSON files describing the CfS structure in general (in CDM), do I really have to create it from scratch? This brings me to the second question:
Access the entry-level data that's already in the demo version - I need to see what's in the database CfS is using, or be able to modify it. Is there any way to get to it via Business Central, if at all possible?
Since I will be preparing several presentations for potential customers, I need a way to quickly create a dataset based on the initial, very basic information provided by each customer. How can I do that with a trial user?
I work for a company that's a Microsoft Certified Partner, so logically the resources I need should be available to me, but the links in the documentation are either dead (and some are, as they redirect to general info) or require some special access level (or are dead, but the error message is really not helpful at all).
Is there somewhere else I can go? The Documentation page offers little towards what I need...
P.S. I think I should tag this question with CfS specific tags, but not enough rep...
I am currently working with PDI 8.2; however, I am able to upgrade to 9.0.
The main issue is that a customer wants to pull data from Salesforce, which has worked well so far. But he is using the Enterprise Web Services API at version 48.0, while the latest Pentaho supports v47.0 only.
I strongly assume that reading via v48.0 won't work with PDI, so I will have to build a workaround. Could anyone point me to a feasible solution? To be honest, I don't even know whether the Enterprise or the Partner API is relevant for Pentaho. I have my own SF account, so I can experiment with the APIs.
Is the "Web Services lookup" the right step for the workaround?
Any answer would be appreciated! Thanks in advance!
Oh man, what a crazy question, all over the place.
I strongly assume that reading via v48.0 won't work
You'd have to try it, but it should work. Salesforce has three releases a year, and that's when they bump the API version. We're in Spring '20 now, which is v48.0. That doesn't mean anything below is deprecated; you should have no problems calling with any API version >= 20.0. From what I remember, their master service agreement says a released API version stays up for at least 3 years. Well, v20.0 is 9 years old and still going strong...
Check for example https://baa.my.salesforce.com/services/data/ (if your client has "My Domain" enabled, you can use their domain instead of some unknown company's); you should see a list similar to this one: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_versions.htm (no login required; that would be a chicken-and-egg situation, since you have to choose the API version you want when making the login call).
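For illustration, here's how you might pick the newest version out of that /services/data/ response. The JSON below is a hand-written sample in the shape Salesforce returns, not a live call:

```python
import json

# Sample response shaped like Salesforce's GET /services/data/ output
# (values here are illustrative, not fetched live).
sample = json.dumps([
    {"label": "Winter '20", "url": "/services/data/v47.0", "version": "47.0"},
    {"label": "Spring '20", "url": "/services/data/v48.0", "version": "48.0"},
])

versions = json.loads(sample)
newest = max(versions, key=lambda v: float(v["version"]))
print(newest["version"])  # → 48.0
```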
So... what does your integration do? I assume it reads or writes to SF tables (objects), pretty basic stuff. In that sense 47 vs 48 won't matter much. You will still see Accounts, Contacts, custom objects... You won't see tables introduced in v48.0, but unless you must see something mentioned in the Spring '20 release notes, I wouldn't worry too much.
If your client wrote a specific class (web service) to present you with data and it's written as v48.0, it might not be visible when you log in as v47.0. But then they can just downgrade the class's version and all should be well. Such custom services are rarely usable by generic ETL tools anyway, so this is a concern only if you do custom coding.
whether the Enterprise or the Partner API is relevant for Pentaho
Sounds like your ETL tool uses SOAP API. Salesforce offers 2 versions of the WSDL file with service definitions.
"Partner" is generic, all SF orgs in the world produce same WSDL file. It doesn't contain any concrete info about tables, columns, custom services written on top of vanilla salesforce. But it does contain info how to call login() or run a "describe" that gives you all tables your user can see, what are their names, what are columns, data types... So you learn stuff at runtime. "Partner" is great when you're building a generic reusable app that can connect to any SF or you want to be dynamic (some backup tool that learns columns every day and can handle changes without problems. Or there's a "connection wizard" where you specify which tables, which columns, what mapping... new field comes in - just rerun the wizard).
"Enterprise" will be specific to this particular customer. It contains everything "Partner" has but will also have description of current state of database tables etc. So you don't have to call "describe", you already have everything on the plate. You can use this to "consume" the WSDL file, generate your Java/PHP/C# classes out of it and interact with them in your program like any other normal object instead of crafting XML messages.
The downside is that if new field or new table is added - you won't know if your program doesn't call "describes". You'd need to generate fresh WSDL and consume it again and recompile your program...
Picking right one really depends what you need to do. ETL tools I've met generally are fine with "partner".
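The practical difference can be sketched roughly like this (illustrative Python, not real Salesforce client code; the object and field names are invented):

```python
# "Partner" style: the schema arrives as data at runtime
# (the shape of what a describeSObject() call tells you).
describe = {"name": "Account", "fields": ["Id", "Name", "Industry"]}
record = dict.fromkeys(describe["fields"])  # build rows dynamically
record["Name"] = "Acme"

# "Enterprise" style: the schema was generated from the org-specific WSDL,
# so fields are fixed attributes; a new custom field in the org means
# regenerating and recompiling this class.
class Account:
    def __init__(self, Id=None, Name=None, Industry=None):
        self.Id, self.Name, self.Industry = Id, Name, Industry

acc = Account(Name="Acme")
print(record["Name"], acc.Name)  # → Acme Acme
```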
Is the "Web Services lookup" the right step
No idea; I've used Informatica, Azure Data Factory, Jitterbit, Talend... but I don't know this Pentaho thing. Just try it. If you pull data straight from SF tables without invoking any custom code (think of SF custom services like pulling data from stored procedures), the API version shouldn't matter much. If you go below v41.0, I believe you won't see the Individual object, for example, but I doubt you need to be on the cutting edge.
High level, what I'm trying to do:
We want to create a specific email address on our domain for each of our customers (e.g. customer01@xyz.com). When an email is received at that address, our system will associate the attachment with that customer and process it a certain way. The email addresses will only be used for this purpose, so I don't really need a user interface (although it might be nice to have for troubleshooting).
I've just started using AWS and have an overall understanding of the services. I'm planning on doing this on an EC2 instance.
I'm assuming it's possible to set up a mail server (incoming mail only) to constantly monitor all the customer-specific email addresses and process any attachments that come in.
Where do I even start with researching this (I've Googled it but need more direction)? Here are some questions that come to mind:
1) What mail server software is best for this? Or is one even needed?
2) Is it possible to write code that monitors incoming email for ALL the addresses simultaneously? I don't mind buying existing software if it fits our needs.
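Whatever ends up receiving the mail, the per-customer step usually boils down to parsing the raw message, matching the recipient address, and pulling the attachments out. Here's a stdlib-only Python sketch of that step (the customer01@xyz.com pattern comes from your description; the sample message is fabricated for the demo):

```python
import re
from email import message_from_bytes
from email.message import EmailMessage

def route_message(raw: bytes):
    """Match the recipient to a customer and extract all attachments."""
    msg = message_from_bytes(raw)
    m = re.match(r"(customer\d+)@xyz\.com", msg["To"] or "")
    if not m:
        return None, []
    customer = m.group(1)
    attachments = [
        (part.get_filename(), part.get_payload(decode=True))
        for part in msg.walk()
        if part.get_content_disposition() == "attachment"
    ]
    return customer, attachments

# Build a sample message to run the sketch end to end.
sample = EmailMessage()
sample["To"] = "customer01@xyz.com"
sample["Subject"] = "Daily report"
sample.set_content("See attached.")
sample.add_attachment(b"id,amount\n1,9.99\n", maintype="text",
                      subtype="csv", filename="report.csv")

customer, files = route_message(bytes(sample))
print(customer, files[0][0])  # → customer01 report.csv
```

A mail server (or a service like the ones mentioned in the other answers) would hand you the raw message; this function is where your customer-specific processing would hang off.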
I'm a programmer myself but will not be coding this project. I'll be hiring someone from Elance but I want to at least have a general knowledge of what is needed before posting the job.
Thanks for any advice or links to helpful sites to get me in the right direction here.
Your requirement centers on processing received incoming emails. AWS doesn't have an out-of-the-box service for that; Amazon provides SES (Simple Email Service), which will help you deliver emails.
You would need to build your solution on an SMTP server such as MS Exchange or Apache James.
This product: http://www.email2db.com/ should do what you want. You could install it on EC2 or even use their hosted edition. Not cheap, but I suspect a lot cheaper than hiring someone to write something with 1/10 the features.
There are cloud services that provide such a solution, like Mailgun or Mandrill. Personally I've only used Mailgun, and it's awesome. You set up routing rules on a domain based on regexps, and matching messages can be forwarded as a POST to your website. Both providers have a free tier of 10-12k emails per month, which is great.
Regards,
Marc
In an enterprise scenario where a ColdFusion application is used to register for an event, would it be possible to programmatically add an entry to a person's calendar?
Google has given me some partial ideas, like connecting via JDBC, iCalendar, etc...
Any ideas & experiences are appreciated
Thanks!
Something I have seen in the past is that after the person has registered for the event, a link is given, something like:
Click here to add this event to your calendar
The link goes to an .ical file, which Lotus Notes (as well as Outlook, and various other calendaring clients) can use to add the event to the user's calendar. You can create an iCal file using CFiCalLib.
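For reference, a minimal .ics payload is just a few lines of text. Here's a stdlib-only Python sketch of what CFiCalLib (or any generator) ends up producing; the PRODID and UID values are placeholders:

```python
from datetime import datetime, timedelta

def make_ics(summary: str, start: datetime, minutes: int) -> str:
    """Build a minimal single-event iCalendar file as a string."""
    fmt = "%Y%m%dT%H%M%S"
    end = start + timedelta(minutes=minutes)
    return "\r\n".join([          # RFC 5545 uses CRLF line endings
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//registration//EN",
        "BEGIN:VEVENT",
        f"UID:{start.strftime(fmt)}@example.com",  # placeholder UID
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{summary}",
        "END:VEVENT",
        "END:VCALENDAR",
        "",
    ])

ics = make_ics("Conference registration", datetime(2024, 5, 1, 9, 0), 60)
print(ics.splitlines()[0])  # → BEGIN:VCALENDAR
```

Serve that string with a text/calendar content type from your registration confirmation page and Notes/Outlook will offer to add the event.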
It's totally possible.
I've done a lot of direct ColdFusion-Notes integration over the past 12 years, and the newer versions of Notes should have many integration points (scripts, agents, COM, etc.) for creating and editing calendar events.
Adam's approach might be good enough for you; I'd try that first. If you're looking for something more direct (between the ColdFusion and Notes servers, like SharePoint), there are options as well.