How do I rename a SharePoint file to include a date using Nintex - sharepoint-2013

I'm trying to use two workflows to archive files whenever they are created or updated. The first simply moves a copy to a separate document library, with no issues.
The second should rename the file once it arrives, appending a date (and possibly a timestamp) to the end of the file name so that it is a unique record.
I am setting a variable called Archive_Name and then setting the field value to Archive_Name before committing the change.
I am using this formula to set the variable:
Name-fn-FormatDate(Current Date,yyyy-MM-dd)
Both Name and Current Date are recognised variables.
When I run this, the Name stays the same and no date is appended. If I run it as
fn-FormatDate(Current Date,yyyy-MM-dd)
the Name changes to my desired date, which proves that the formula is working, the text is being assigned to the Archive_Name variable, and the variable is being applied to the field value.
What am I doing wrong?

I believe you need to concatenate the two values. The & operator can be used in place of the CONCATENATE function, thus: Name&fn-FormatDate(Current Date,yyyy-MM-dd)
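A minimal sketch of how the inline function evaluates, assuming the item's Name resolves to Report.docx and the workflow runs on 1 March 2016 (both purely illustrative values):

    Name&fn-FormatDate(Current Date,yyyy-MM-dd)
    ->  Report.docx2016-03-01

As written, the date is simply appended to whatever Name contains; if you needed it before the file extension you would have to split the name and extension first.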
Hope this helps

Related

How can we use an advanced date filter (SYSDATE) in the Source while reading the data in Informatica IICS?

I am unable to use advanced filters in the Source in Informatica before reading the data. I want to compare a field with SYSDATE, which is a predefined system variable in Informatica, so in the Source's advanced filters I am trying tablename.field=SYSDATE or tablename.field=$$SYSDATE, but none of them is working.
Please help: how can I compare the field with SYSDATE?
You need to define a parameter, e.g. SYSDATE, and then refer to the parameter value by putting $$SYSDATE in the filter - almost like you do, but the parameter has to be declared and a value defined for it. Otherwise this is just a comparison to the string SYSDATE, not the desired date value.
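One way to supply the value is a parameter file entry inside the section for your task (the parameter can also be given a default value in the mapping itself). A minimal, illustrative sketch, with all names and the date format assumed rather than taken from the question:

    $$SYSDATE=2016-03-01

and a Source filter along the lines of:

    tablename.field = TO_DATE('$$SYSDATE','YYYY-MM-DD')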

Loop over all the files from a folder and import the one that matches regex criteria in SSIS

I have a particular task that I need to get done, but I find it difficult to find any matching case on the internet.
In the company I work for, we have a VPN share into which folders named with the current date are dropped every day.
I need to create an ETL (in SSIS) which will loop over all the files from specific folders, extract the one file that I need, and then populate a table.
The name of the particular file changes every day. It keeps the same first n characters and ends with a date that may be one, two, or three days before the current date.
It's straightforward that I need to use a Foreach Loop container to loop over all the files of the folder. But how can I select the one file that starts with specific characters?
Essentially, does anybody know how can I use regular expression in a connection in SSIS?
Thank you,
The SSIS Foreach Loop container accepts the * wildcard. In my case the files are a1, a2, a3, a4, etc., so a* reads them sequentially. You just need to capture the file name in a variable and use it inside the loop.
You could use a Script Task for this, and use the System.IO namespace to find the file you need. You could use System.IO.Directory.EnumerateFiles to loop over the files, or, as you know what the filename will look like, you could check for the existence of the file with yesterday's date, then the day before, etc. in a loop (going back as far as you wish to), and then set a variable with the file path once you find one that exists. You could then use this variable as the Connection Manager's Connection String, setting it in the Connection Manager's Properties (Expressions).
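A minimal sketch of that Script Task approach in C#, assuming an SSIS string variable User::SourceFilePath (added to the task's ReadWriteVariables) and an illustrative folder path and file-name pattern. The using directives go at the top of the generated ScriptMain file, and the rest replaces its Main() method:

    using System;
    using System.IO;
    using System.Linq;
    using System.Text.RegularExpressions;

    public void Main()
    {
        // Illustrative values - replace with your share and naming convention.
        string folder = @"\\vpn-share\daily-drop";
        // Fixed prefix followed by a date, e.g. MyExtract_20160301.csv
        var pattern = new Regex(@"^MyExtract_\d{8}\.csv$", RegexOptions.IgnoreCase);

        // Pick the matching file with the latest date (yyyyMMdd dates sort lexically).
        string latest = Directory.EnumerateFiles(folder)
            .Where(f => pattern.IsMatch(Path.GetFileName(f)))
            .OrderByDescending(f => Path.GetFileName(f))
            .FirstOrDefault();

        if (latest != null)
        {
            // Hand the full path to the package for use by the Connection Manager.
            Dts.Variables["User::SourceFilePath"].Value = latest;
            Dts.TaskResult = (int)ScriptResults.Success;
        }
        else
        {
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
    }

The resulting variable can then drive the flat-file Connection Manager through a ConnectionString property expression, as described above.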

How to create a new attribute with a default value in RapidMiner?

I am new to the RapidMiner tool. There are two data sets in my process. What I want to do is generate a process which does the following, using the Generate Attributes, Append, and type conversion operators in RapidMiner:
The first data set has a car name attribute, whereas the second data set has a name attribute. name should be renamed to car name.
The second data set has an additional other attribute which is not present in the first data set. Update the first data set to add an additional other attribute with a default value of 1. This attribute should also have a type of Integer.
Append the modified second data set to the modified first data set.
Export the new data to a new Excel spreadsheet.
I found the solution. Hope it will help others.
Please use the process flow below:
http://i.stack.imgur.com/omfDe.png

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. In table A is the data; in the other table, B, is information for the incremental load of the data from the first table. I want to read from table B the date of the last successful load of table A and store it in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row from this source!
Then I use this $$LOAD_DATE variable in the Source Filter of the Source Qualifier of table A to load only the records which are younger than the date stored in the $$LOAD_DATE variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It always contains 1753-01-01 00:00:00, which is the default value for mapping variables of the date/time type.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifiers source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. This would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable (with SETVARIABLE in an Expression transformation, as you already do).
In the first session's configuration you have to define the Post-session on success variable assignment.
The second mapping (with your table A) will then receive the variable through the second session's Pre-session variable assignment.
It will work.
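A rough sketch of the whole chain, with assumed names (LAST_LOAD_DATE, $$WF_LOAD_DATE, and TABLE_A.LAST_MODIFIED are illustrative, and the exact wording of the session dialogs may differ slightly by version):

    Mapping 1 (reads table B) - Expression transformation, output port feeding the dummy target:
        SETVARIABLE($$LOAD_DATE, LAST_LOAD_DATE)

    Session 1 - Post-session on success variable assignment:
        $$WF_LOAD_DATE = $$LOAD_DATE          (mapping variable copied into a workflow variable)

    Session 2 - Pre-session variable assignment:
        $$LOAD_DATE = $$WF_LOAD_DATE          (workflow variable pushed into mapping 2's variable)

    Mapping 2 (reads table A) - Source Qualifier, Source Filter:
        TABLE_A.LAST_MODIFIED > $$LOAD_DATE

$$WF_LOAD_DATE has to be declared as a workflow variable, and $$LOAD_DATE as a mapping variable in both mappings.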
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually set only when the session completes.
If you really want to implement it using mapping variables, you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable. https://stackoverflow.com/a/26849639/2626813
Another solution could be to use a Lookup on table B and a Filter after that.
You can also write a script to query table B and update the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
Since we have two different DBs, use two sessions: get the values in the first one and pass the parameters to the second one.

SP 2013 - Quick Edit with Managed Metadata columns, copy and paste from Excel

I'm trying to migrate metadata from an Excel spreadsheet to a SP 2013 document library. The columns are managed metadata columns with predefined terms matching the data in the Excel spreadsheet.
However I cannot copy and paste data from Excel via Quick Edit in the document library without getting the following error: "The data returned from the tagging UI was not formatted correctly"
This happens even when I remove all formatting or paste to notepad first.
Are there any simple solutions to this issue?
http://i.imgur.com/1bqpMPA.jpg
Thanks,
Metadata fields are in fact foreign keys, as it were, to a dynamic, hidden table (or 'list', whatever you want to call it) within SharePoint. To paste a value into a metadata column, you need to know your element's GUID (as in, within the term set) and then append that to each metadata element you're pasting in as a <name>|<guid> pair.
Getting the GUID for an element within your term set
Browse to [site-root]/TaxonomyHiddenList/AllItems.aspx and create a new view (or edit the default one) to display the field 'IdForTerm'.
Where you have a term 'apple', your IdForTerm may look like '1288beaf-82e0-4d81-b9de-ad5ad8382938'. Take a note of the guid for each term which appears within your input data.
Edit your input to correctly reference each term
Let's say you're importing your data from an Excel spreadsheet. Or from a CSV. It doesn't really matter. What you need to do is, basically, a find and replace down each managed metadata column, replacing 'term' with 'term|guid'. So our example from earlier, with the apple, would become 'apple|1288beaf-82e0-4d81-b9de-ad5ad8382938'.
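If the data is in Excel, one way to do that find-and-replace is a helper column built from a lookup range. The layout here is assumed for illustration: terms are in column A of the data sheet, and a sheet named Terms holds term names in column A with their GUIDs in column B.

    =A2 & "|" & VLOOKUP(A2, Terms!A:B, 2, FALSE)

Copy the helper column back over the original column as values before pasting into Quick Edit.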
Finally, assuming your view is set up in exactly the same order as your input data, you should be able to 'edit list' from within the browser, hit the leftmost side of your first input row (to select the entire row) and CTRL+V all of your data at the same time.
Note there appears to be a limit to the number of entries you can make at the same time. It appears to sit at around 5,000 elements.
Adding on to #rmacd's answer, you can also get the GUID for a given MMS term by first manually entering the value(s) you need in a Quick Edit cell, then copy and paste the same value(s) from SharePoint to Excel. The pasted value will appear with the full term|guid that you need to complete the bulk copy/paste.