I found a file named XXX.fbl7 whose type is listed as "FinalBuilder Log File".
If this is the log file, how can I open it?
When I double-click it in Windows Explorer, I get "The project file specified on the command line was not found or invalid".
In the desktop application, under Tools -> Options -> Logging, there is a tab that allows you to export the logs to text, XML, and HTML.
In a FinalBuilder project, you can also use a FinalBuilder action to create a separate log file in the format you want.
Personally, I just use the FinalBuilder Server notification options to let me know when things have failed, then go to the web server and review the full log in the web page.
FinalBuilder log files are binary files; you can't open them without having the product installed. As far as I remember, there was an option to export them as text files, but I haven't used FB for years, so I might be wrong.
I have created a Pub/Sub function in the Console, and I want to upload a folder with my project through the Console, not the terminal, every time I have an update.
I use Python.
The docs say I can find a button to upload a ZIP, but there is nothing like that.
https://cloud.google.com/functions/docs/deploying/console
My questions are:
How do I upload my project from the Console? I can see the default source code in the Console.
Do I need to name my entry file main.py or index.py?
Do I need to set up a requirements.txt file myself? I can't see one in my project on my machine.
You have to click the 'Edit' button to edit the function; then, in the 'Source' tab, to the left of the source, there is a drop-down where you can select "Upload ZIP".
Doing this in the terminal seems to be easier (note that sudo is not needed):
gcloud functions deploy Project_name
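As for your other two questions: with the Python runtime, the entry file must be called main.py, and the function you name as the entry point must be defined in it; requirements.txt only needs to exist if your function has third-party dependencies, so you may well have to create it yourself. A minimal sketch of a Pub/Sub-triggered function (the function name here is just a placeholder):

# main.py - a minimal background function for a Pub/Sub trigger
import base64

def hello_pubsub(event, context):
    # 'event' carries the Pub/Sub message; 'data' is base64-encoded
    # 'context' carries event metadata (event ID, timestamp, etc.)
    if 'data' in event:
        message = base64.b64decode(event['data']).decode('utf-8')
        print('Received: {}'.format(message))

When deploying from the terminal, you would then pass --entry-point hello_pubsub (or whatever function name you entered in the Console).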
I have created a simple mapping to convert a file from one format to another. The PowerCenter 9.5.1 client is running on a Windows machine. I am NOT able to set the full "Source filename" correctly in the Workflow session.
When I set it to "c:\temp\people.csv", the backslashes somehow get converted to forward slashes, resulting in a file-read error: "READER_1_1_1 FR_3000 Error opening file ["c:/temp/people.csv"]."
I tried the URL format "file:\c:\temp\people.csv", but it did not work either. Also, "c:/temp/people.csv" does not work.
Please note that I tried using both "Source file directory" and "Source filename", but the slash conversion still takes place, resulting in the same error.
Any suggestions? Is there any setting to keep paths as-is, Windows-style? Thanks in advance.
It may not be a slash issue. It seems you're referring to a local path on the machine where your Informatica client is installed. Please note that mappings get executed on the Informatica Integration Service, so you need to point to a path accessible by that server.
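For example, assuming the Integration Service runs on a Unix host and the file has been copied to a directory it can read (the path below is purely illustrative), the session properties would look like:

Source file directory: /opt/informatica/srcfiles/
Source filename: people.csv

If the file has to stay on your Windows PC, you would need to copy it to the server (or share the folder so the server can reach it) before the session runs.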
I'm new to Alexa Skill development and I'm sure this issue is process/environmental due to lack of experience.
Whenever I try to use a sample from an official Alexa tutorial, I can never get the skill to pass the first TEST; I always get an error :(
In this case I am trying to run and fiddle with this tutorial:
https://developer.amazon.com/blogs/post/TxHGKH09BL2VA1/New-Alexa-Skills-Kit-Template-Step-by-Step-Guide-to-Build-a-Decision-Tree-Skill
What is happening / What I've done:
I download the Node SDK from the Git link, and I also download the sample from its Git link. I then create a new ZIP that contains the sample code, with the Node SDK included at the path /src/alexa-sdk/.
I go to AWS and create a new function, not using a blueprint. I 'author from scratch' and create a function with the Skills Kit as a trigger. I name the function and use the Node 6.10 runtime.
I upload my ZIP file and leave all boxes at their defaults; for Role I choose Custom Role, then pick Basic Execution from the Role screen.
I leave the rest blank, go to NEXT and CREATE.
The function is created okay, but I do see this message: 'This function contains external libraries. Uploading a new file will override these libraries.'
Here's the problem, and it is the point of failure in all the tutorials I've tried so far: I go to Configure Test Event, choose ALEXA START SESSION as the template, and click Save And Test...
EXECUTION RESULT FAILED:
{
  "errorMessage": "Cannot find module '/var/task/index'",
  "errorType": "Error",
  "stackTrace": [
    "require (internal/module.js:20:19)"
  ]
}
Here's something from the associated error logs; I'm not sure if it's useful:
Unable to import module 'index': Error
at Module.require (module.js:497:17)
at require (internal/module.js:20:19)
I have noticed two things that I suspect may be an issue:
1) When I go to the CODE tab for this function, I see this message:
Your Lambda function "testprojectx" cannot be edited inline since the file name specified in the handler does not match a file name in your deployment package.
2) When I look at the code that's inserted into the test when I choose ALEXA SESSION START, I see many instances of 'unique value here':
amzn1.echo-api.session.[unique-value-here]
There is no mention of this in the tutorial link I am referencing, though.
I'm really downhearted about it now, as this is the third tutorial I've tried to configure. Can anybody with experience follow the steps I've taken and point me in the right direction?
Thank you SO MUCH in advance if so.
EDIT: Absolute Clarification on how I am creating the ZIP file
I'm using Windows 10 and Chrome to download the files from GitHub.
I download the skill-sample-nodejs-decision-tree-master ZIP file from GitHub.
I do not know how to use NPM, so I do this simply by downloading to the desktop.
I then download the alexa-skills-kit-sdk-for-nodejs-master.ZIP file to the desktop.
I unzip the contents of decision-tree-master into a folder on the desktop, also called alexa-skills-kit-sdk-for-nodejs-master.
Within this folder, I navigate to /src/ and create a new folder called 'node_modules' within /src/.
Within /src/node_modules/ I now create another new folder called 'alexa-sdk'.
I unzip the contents of alexa-skills-kit-sdk-for-nodejs-master.zip into /src/node_modules/alexa-sdk/.
I have tried two approaches from here - both fail:
1) I ZIP only the contents of /src/ (not including the /src/ folder itself) and upload to Amazon.
2) I ZIP the entire 'decision-tree-master' folder and upload to Amazon.
I must be missing something; as I said, this is just one of many Alexa tutorials I've tried to get working, and this always happens :( So disheartened now.
This is a common issue I have seen in many posts. In most cases, the problem is the way the files are zipped. Instead of zipping the folder itself, you have to select all the files inside it and zip those, as sketched below.
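For this tutorial, that means the top level of the archive should look roughly like this (the Handler field on the Lambda configuration page, index.handler by default, expects a file called index.js at the ZIP root):

index.js
package.json
node_modules/
    alexa-sdk/
        ...

If index.js ends up inside a subfolder (e.g. src/), Lambda looks for /var/task/index and fails with exactly the "Cannot find module '/var/task/index'" error you are seeing.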
Is there a way to flush the log in Sitecore so that entries are written immediately? It's for production debugging.
Also, looking through the log files, there are a number of them, e.g. log.date.txt and log.date.time.txt. Which one is the latest, i.e. the one with or without the time?
You can use the following module on a production server if you have remote access to it:
https://marketplace.sitecore.net/Modules/S/Sitecore_Log_Analyzer.aspx
Another option is to use this module:
https://marketplace.sitecore.net/Modules/S/Sitecore_ScriptLogger.aspx
The log with no timestamp in the file name is the first one for that day.
A new log file is created each time the application pool restarts.
If you haven't changed any of the default log4net settings, the initial log file will be in the format log.yyyyMMdd.txt; each subsequent restart will cause a new file to be generated in the format log.yyyyMMdd.HHmmss.txt.
The latest log file for the day will be the file with the latest timestamp.
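For example (dates and times here are made up):

log.20171019.txt         <- created when the app pool first started that day
log.20171019.143201.txt  <- created by a restart at 14:32:01; this is the latest one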
I am stuck on one problem, so please help me make things clear.
Please read the following three points and help me out.
(1)
I have simply followed this https://www.wowza.com/docs/how-to-start-and-stop-live-stream-recordings-programmatically-livestreamrecordautorecord-example#documentation
I have attached my Application.xml. Now, when I publish a live stream named "test1" via FMLE, it gets recorded on the server. But when I run a different instance of FMLE on a different PC and publish a live stream named "test2", it does not get recorded; I think it goes into the previously recorded file "test1" (meaning no separate file is recorded, although there should be two recorded files, test1 and test2).
Why is this happening?
Is com.wowza.wms.plugin.livestreamrecord.module.ModuleAutoRecordAdvancedExample for single-stream recording? That is, if I publish streams A, B, C, and D, will it record them into one single file (probably named A.mp4, as A was the first published stream)?
(2) What is the module at https://www.wowza.com/docs/how-to-start-and-stop-live-stream-recordings-programmatically-imediastreamactionnotify3#comments for?
I have implemented this code in Eclipse, successfully put the JAR in the lib folder, and configured everything. Now, again, I am not able to record different streams under their corresponding names. That is, if I publish stream1 and stream2, the desired output should be two different files (in the content folder), but again I see one single file being recorded.
(3) Can I use ModuleLiveStreamRecord.java? It was in an older version of Wowza, but I have properly imported the required JAR and tested it.
My requirement is very simple:
As soon as users start publishing, Wowza should start recording. If 10 users are publishing live, 10 files should be generated.
Don't make things more difficult than necessary (assuming you have Wowza 4.x; if you still have 3.x, then I highly recommend upgrading, which is free).
Open the Engine Manager (http://your.server.com:8088)
Go to "Applications" from the top menu
Select your application from the left menu (e.g. "live")
In the setup window for this application, click the blue Edit button
Enable "Record all incoming streams"
Click "Save"
Click the orange "Restart now" button at the top
Done
Every stream that is published via this application will now automatically be recorded. The default folder for recordings is the /content folder in your Wowza installation. You can change this on the same page under "Streaming File Directory" (make sure it's a directory on your local system, unless you understand really well how Wowza works).
The filename is always the streamname + ".mp4", but when you start a new recording while the file already exists, the old file will be renamed first.
Want to control recording manually? Start publishing first, then select "Incoming streams" from the left menu and use the big red dot button behind a stream name to start recording.
If your server produces any different behavior with regards to the file (re)naming or recording, then you may need to review your Wowza setup.
I appreciate your response, KBoek.
I sorted out the issue, but some real debugging was needed, since I was building a custom module. I had to write a custom module for automatic live recording because I wanted HTTP authentication and a custom name for each recording.
Thanks again.