Location of ColdFusion connection string settings

This is new to me: I am looking into a ColdFusion website.
The problem is I cannot even find the connection string. In some qryXXX.cfm files, I find
<cfquery name="GetXXX" datasource="xxxx">
But I just cannot find where this datasource is stored.

The datasource is defined in the ColdFusion Administrator.
On a default developer install the CF Administrator is available at:
http://localhost:8500/CFIDE/administrator/index.cfm
(On a server, it may or may not be configured differently.)
Once logged in, in the left-hand menu go to the "Data & Services" section; the first entry is "Data Sources", which is where you want to go.
Within this area, you will find a list of all defined datasources - the first icon in the Actions column lets you edit/view the details.
The actual data which the CF Administrator works with is stored in {coldfusion-dir}/lib/neo-datasource.xml

In ColdFusion 7 this file is called neo-query.xml,
and in a JRun install it is stored at
C:\Apps\JRun4\servers\cfusion\cfusion-ear\cfusion-war\WEB-INF\cfusion\lib
The path may differ slightly depending on your install.
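If you can't get into the Administrator UI, here is a rough sketch using the ColdFusion Admin API to list the defined datasources (available since CF7; the admin password below is a placeholder):

<cfset adminObj = createObject("component", "cfide.adminapi.administrator")>
<!--- log in to the Admin API with the ColdFusion Administrator password --->
<cfset adminObj.login("yourAdminPassword")>
<cfset dsService = createObject("component", "cfide.adminapi.datasource")>
<!--- dump the struct of datasource definitions that the Administrator stores --->
<cfdump var="#dsService.getDatasources()#">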

Related

Error provisioning namespace. ORA-20001 Request could not be processed at Oracle Apex

I finally managed to install Oracle APEX 5.1.2, but I have a problem creating a workspace. Whenever I try to do so, at the end I get an error:
I tried to create this workspace with the following values:
The strange thing is that when I select Yes for the Reuse Existing Schema option, no schemas are listed. Is it possible that APEX somehow doesn't have access to managing schemas?
I am using APEX with ORDS. On the home page I am told that I have 1 workspace and 1 schema.
I've tried:
Using strong passwords as mentioned here
Changing the provisioning type to request: the effect is the same. If a user requests a workspace and I accept it, I get the exact same error.
Enabling OMF with the parameter DB_CREATE_FILE_DEST = '/u01/app/oracle/oradata' -> no *.dbf files are created in that directory, either before or after the change.
The root cause of this problem was installing APEX on both CDB$ROOT and PDB1. I uninstalled APEX from the root, recompiled invalid objects with the utlrp.sql script as in this tutorial, and installed APEX again, but only on PDB1. The workspace was then created successfully.
I had the same problem (APEX 18.1/ORDS) in a database without a CDB configured. The solution in my case was to run the apex_rest_config.sql script.
After that, the workspace was created without any problem.
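For what it's worth, a hedged example of running that script from SQL*Plus, in the same style as the steps further down (the APEX directory is a placeholder; the script should prompt for the APEX_LISTENER and APEX_REST_PUBLIC_USER passwords):

$ cd /path/to/apex
$ sqlplus sys as sysdba
SQL> @apex_rest_config.sql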
If you don't want to reinstall APEX to move it from the CDB to the PDB, I suggest you try setting up PDB mapping in your ORDS config file.
https://docs.oracle.com/en/database/oracle/oracle-rest-data-services/20.2/aelig/configuring-REST-data-services.html#GUID-694B2F89-CE4F-4AB0-88E2-EB35D03DEC3C
I did it by adding
<entry key="db.serviceNameSuffix"></entry>
to the end of my defaults.xml (you can find its location by running
$ java -jar ords.war configdir ).
Then access APEX with /yourpdb in the path, e.g.
http://server:port/ords/pdb1
This will run APEX from that PDB instead of from the CDB and will create the workspace there; that should work fine. It did for me.
I had the same problem on Oracle 12c; following this link, my problem was solved. The problem is that users can't create a workspace in the CDB, so you must switch the session container to your APEX PDB with the following steps:
$root> cd ~/TEMP/apex
$root> sqlplus
Enter user-name: sys as sysdba
Enter password:
SQL> exec dbms_xdb.sethttpport(0); /* disable the XDB HTTP port in the root container */
SQL> alter session set container=YOURAPEXPDB;
SQL> exec dbms_xdb.sethttpport(8181); /* set the XDB HTTP port in the APEX PDB */
SQL> alter system register;
/* then install Oracle APEX again */
To remove Oracle APEX I used this link; it worked perfectly for me.

Sitecore Commerce Connect > Refreshing a Cache via code

I am trying to update the commerce catalog from an external source. After the incremental update I need fresh data in the Sitecore tree (the data provider should return the correct data instead of the old, cached data). However, if I go to Sitecore right after the data import I can see only the old data until I click the "Refresh Catalog Cache" button in the Sitecore Commerce menu.
I found the same info in the documentation for Sitecore Commerce Connect; however, I can't find any example of how to clear the cache via code.
I found several types in the "Sitecore.Commerce.Connect.CommerceServer.Caching" namespace, for example the static CacheRefresh class. It has a RefreshCatalogCaches method which takes an ICommerceServerContextManager contextManager as an input parameter. If I create the contextManager just by using the constructor new CommerceServerContextManager() and pass it to the method, it doesn't work (at least, I still need to clear the cache manually).
I would appreciate any advice or suggestions.
Thank you in advance.
In your code you should do the same thing that happens on the "Refresh Catalog Cache" button click:
CacheRefreshEvent eventX = new CacheRefreshEvent("catalogcache", "master", ID.Null);
EventManager.QueueEvent<CacheRefreshEvent>(eventX, true, true);
For more details, look at the implementation of Sitecore.Commerce.Connect.CommerceServer.Caching.RefreshCache in the Sitecore.Commerce.Connect.CommerceServer assembly with a reflector/decompiler.
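For completeness, a rough sketch of the same call wrapped in a small helper, with the namespaces I would expect it to need (the CacheRefreshEvent namespace is an assumption based on the answer above; verify against your Commerce Connect version):

using Sitecore.Commerce.Connect.CommerceServer.Caching;
using Sitecore.Data;
using Sitecore.Eventing;

public static class CatalogCacheHelper
{
    // Queues the same remote event that the "Refresh Catalog Cache" button raises,
    // so every instance listening for the event refreshes its catalog cache.
    public static void RefreshCatalogCache()
    {
        var refreshEvent = new CacheRefreshEvent("catalogcache", "master", ID.Null);
        EventManager.QueueEvent(refreshEvent, true, true);
    }
}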

How to export registered server settings in Aqua Data Studio?

Does anyone know how to export registered servers in Aqua Data Studio? Maybe there's some tricky method to do it by copying some .ini file or registry keys?
Aqua Data Studio server registrations are in the [USER_HOME]/.datastudio/connections directory. You can copy your existing connections from one machine to another.
AquaFold's documentation about copying registrations from one computer to another is here:
https://www.aquaclusters.com/app/home/project/public/aquadatastudio/wikibook/Documentation16/page/128/Configuration-Connection-files#copy
*** Note: make sure you take a backup of the .datastudio directory before replacing files.
To export the connections, please make sure you have a fresh installation of Aqua Data Studio on the other system and you haven't set up any new connections.
1) Simply go to C:\Users\[userName]\.datastudio
Copy the folders and files below and place them in the same location on the new system:
C:\Users\[userName]\.datastudio\connections
C:\Users\[userName]\.datastudio\bigquery
C:\Users\[userName]\.datastudio\pfile.properties
pfile.properties has the cipher key to decrypt passwords on the system.
I initially only copied the connections folder and found none of my passwords worked anymore. Then I added the pfile.properties file. That fixed the password problem, but when I tried to open a Query Analyzer window on any of my many MS SQL Server registered servers, I got an error: Id 18456, Level 14, State 1, Line 1 Login failed for user '<username>'.
By copying the rest of the files and subfolders in the .datastudio folder (except for the history, which I didn't need, and the license files, as I had to renew my license anyway), the error was cleared. Bottom line: copy the entire .datastudio folder to transfer your configuration to the new machine, as is documented in the aquaclusters Wiki link: "Copying this directory to a new computer copies all of your current ADS customizations and server registrations."
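A hedged example of copying the whole folder on Windows with robocopy (user names and target path are placeholders; /E copies all subfolders and /XD skips directories you don't want, such as the history):

robocopy "C:\Users\oldUser\.datastudio" "D:\transfer\.datastudio" /E /XD history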

WOWZA LiveAutoRecord

I am tired of one problem, so please make things clear to me.
Please read the following three points and help me out.
(1)
I have simply followed this https://www.wowza.com/docs/how-to-start-and-stop-live-stream-recordings-programmatically-livestreamrecordautorecord-example#documentation
I have attached my Application.xml. Now when I publish the live stream name "test1" via FMLE, it gets recorded on the server, but when I run a different instance of FMLE on a different PC and publish the live stream name "test2", it does not get recorded and I think it goes into the previously recorded file "test1" (meaning no separate file is recorded, whereas there should be two recorded files, test1 and test2).
Why is this happening?
Is this com.wowza.wms.plugin.livestreamrecord.module.ModuleAutoRecordAdvancedExample for single-stream recording? Meaning, if I publish streams A, B, C and D, will it record them in one single file? (Probably the output file will be A.mp4, as A was the first published stream?)
(2) What is this https://www.wowza.com/docs/how-to-start-and-stop-live-stream-recordings-programmatically-imediastreamactionnotify3#comments module for?
I have implemented this code in Eclipse, successfully put the jar in the lib folder and configured everything. Again I am not able to record different streams with their corresponding names. Meaning, if I publish stream1 and stream2, the desired output should be two different files (in the content folder), but again I see one single file being recorded.
(3) Can I use ModuleLiveStreamRecord.java? This was in an older version of Wowza, but I have properly imported the required jar and tested it.
My requirement is very simple:
As soon as users start publishing, Wowza should start live recording. If 10 users are publishing live, 10 files should be generated.
Don't make things more difficult than necessary (assuming you have Wowza 4.x; if you still have 3.x then I highly recommend upgrading for free):
Open the Engine Manager (http://your.server.com:8088)
Go to "Applications" from the top menu
Select your application from the left menu (e.g. "live")
In the setup window for this application, click the blue Edit button
Enable "Record all incoming streams"
Click "Save"
Click the orange "Restart now" button at the top
Done
Every stream that is published via this application will now automatically be recorded. The default folder for recordings is the /content folder in your Wowza installation. You can change this on the same page under "Streaming File Directory" (make sure it's a directory on your local system, unless you understand very well how Wowza works).
The filename is always the streamname + ".mp4", but when you start a new recording while the file already exists, the old file will be renamed first.
Want to control recording manually? Start publishing first, then select "Incoming streams" from the left menu and use the big red dot button behind a stream name to start recording.
If your server produces any different behavior with regards to the file (re)naming or recording, then you may need to review your Wowza setup.
I appreciate your response, KBoek.
I sorted out the issue, but some real debugging was needed since I was writing a custom module. I had to write a custom module for live auto-recording because I wanted HTTP authentication and then a custom name for the live recording.
Thanks again.
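For reference, a rough sketch of the recording part of such a module, assuming the Wowza 4.x LiveStreamRecordManager API (the HTTP authentication and custom file naming are left out; class and field names should be verified against your Wowza SDK, and startRecording would be called from your stream-publish handler, e.g. an IMediaStreamActionNotify3.onPublish implementation):

import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.livestreamrecord.manager.IStreamRecorderConstants;
import com.wowza.wms.livestreamrecord.manager.StreamRecorderParameters;

public class PerStreamRecorder
{
    // Starts one MP4 recording per published stream name, e.g. test1.mp4 and test2.mp4.
    public static void startRecording(IApplicationInstance appInstance, String streamName)
    {
        StreamRecorderParameters recordParams = new StreamRecorderParameters(appInstance);
        recordParams.fileFormat = IStreamRecorderConstants.FORMAT_MP4;
        appInstance.getVHost().getLiveStreamRecordManager().startRecording(appInstance, streamName, recordParams);
    }

    // Stops the recorder for the given stream name, e.g. when the stream is unpublished.
    public static void stopRecording(IApplicationInstance appInstance, String streamName)
    {
        appInstance.getVHost().getLiveStreamRecordManager().stopRecording(appInstance, streamName);
    }
}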

How do I rebuild a custom Lucene index on a Sitecore content delivery server?

The custom Lucene index on my Sitecore 6.2 Content Delivery server seems to be not right. So I think I need to rebuild all 3 of my custom indexes. How do I do that? Do I just have to use the shared source Index Viewer module? Right now I have that installed on my CD server, however for some reason it is not working. When I select my custom index in Index Viewer - nothing happens. So I can't rebuild the index that way. Can I just delete the index files from the hard drive? If so, how quickly will Lucene rebuild them?
As noted above, earlier versions of Sitecore 6.x required custom indexes to be rebuilt using either IndexViewer or some custom code. I believe that in a revision of 6.5, Control Panel > Database > Rebuild Search Indexes began including custom indexes, so IndexViewer is no longer necessary (but should still work).
To your specific question though, on my CD servers I have a rebuild script that can be called directly to rebuild search indexes. I forget where I found this script (I believe it was something published by Alex Shyba at Sitecore). You can find the details of this script at https://gist.github.com/Refactored/6776801
However, I believe you have a different issue that needs to be addressed. If your CD servers aren't detecting changes and therefore not updating you have a configuration issue. I would start with this article when troubleshooting index issues: http://sitecoreblog.alexshyba.com/2011/04/search-index-troubleshooting.html
I ended up contacting Sitecore support and they pointed me to the shared source module called Sitecore Support Toolbox - http://marketplace.sitecore.net/en/Modules/Sitecore_Support_Toolbox.aspx. Once I installed that I was able to easily rebuild my indexes.
Since Sitecore 6.6 update 3 or 4 (I don't remember which one it was) you can rebuild your custom indexes from the Sitecore Control Panel.
In all previous versions you need to rebuild it from code or using custom modules for Sitecore. Deleting index files won't work.
The simplest code for rebuilding custom Sitecore Lucene Index is:
Sitecore.Search.SearchManager.GetIndex("your_index_name").Rebuild()
The blog post "Troubleshooting Sitecore Lucene search and indexing" can help you if rebuilding the index won't solve your problem.
I have come across the same requirement in one of my projects. Here was my solution:
Create a configuration content item with a template that has only one field, say "Rebuild Index", with a default value of "1". An example item path could be: "/sitecore/content/mysite/config/index rebuild flag"
Create an IndexRebuilder class that has a Run method. Within the Run method, check the "index rebuild flag" item (from the context database) and rebuild the index on the server if the "Rebuild Index" field value equals "1". After a successful rebuild, update the field value back to "0" (a sketch of such a class is shown after these steps).
Set up a scheduled agent that points to the IndexRebuilder class. For example:
<agent type="MyAssembly.IndexRebuilder, MyAssembly" method="Run" interval="00:00:00"/>
Notice that the interval is "00:00:00" by default, to turn off the agent on the content management server. Your build and deployment process should change this value to, say, "00:05:00", which allows the agent to run every 5 minutes.
From there, to rebuild the index on a content delivery server, just publish the "index rebuild flag" item from the master database to the content delivery database (web), and the index on your content delivery server should start rebuilding within 5 minutes.
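A rough sketch of such an IndexRebuilder class (the item path, field name and index name are the placeholders used above; I'm assuming the web database on a content delivery server):

using Sitecore.Configuration;
using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.Search;
using Sitecore.SecurityModel;

public class IndexRebuilder
{
    // Called by the scheduled agent; rebuilds the index only when the flag item says so.
    public void Run()
    {
        Database database = Factory.GetDatabase("web");
        Item flagItem = database.GetItem("/sitecore/content/mysite/config/index rebuild flag");
        if (flagItem == null || flagItem["Rebuild Index"] != "1")
        {
            return;
        }

        SearchManager.GetIndex("your_index_name").Rebuild();

        // Reset the flag so the agent does not rebuild again on the next run.
        using (new SecurityDisabler())
        {
            flagItem.Editing.BeginEdit();
            flagItem["Rebuild Index"] = "0";
            flagItem.Editing.EndEdit();
        }
    }
}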
Clicking Index Viewer with nothing happening is usually an indication that certain files of the Index Viewer package have not been deployed to your CD server. The easiest fix for this - if you do have /sitecore running on the CD server - is to just re-install the package directly on the CD server. After this, IndexViewer will work.
If you don't have /sitecore on your CD server (Sitecore recommends removing it, or at least blocking access to it), it becomes more problematic. I would recommend setting up a page/web service or similar that executes the code Maras suggests above - that way you can always trigger an index rebuild when you need it.
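For example, a minimal hedged sketch of such a page, dropped on the CD server and protected however you see fit (the index name is a placeholder):

<%@ Page Language="C#" %>
<script runat="server">
    protected void Page_Load(object sender, System.EventArgs e)
    {
        // Rebuilds the custom Lucene index named below, then reports back.
        Sitecore.Search.SearchManager.GetIndex("your_index_name").Rebuild();
        Response.Write("Index rebuild triggered");
    }
</script>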