Web application with specific architecture - web-services

I want to know the best way to build a web application with the following requirements:

We have one database with confidential information (for example, it has a table "Customers"). It must be located on the customer's premises.

We have a web application that is hosted remotely from the customer. Certain information in this application (for example, a customer's name) must never leave the customer's premises (hardware or software). My first idea: also create a local web application. When my remote app gets a query result from the local DB, the result would contain only object IDs. The remote app would then redirect these IDs to the local app, and the local app would send a query with the IDs as parameters to the local DB and retrieve the actual information. I don't know whether this is feasible, but I suspect it's a very bad architecture. The second option is to encode (encrypt) the information. Maybe there are other approaches. I would be happy to hear your advice. Thank you.
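To make the first idea concrete, here is a rough sketch of the local "resolver" piece (Python/Flask chosen only for illustration; the table and column names are made up):

import sqlite3

from flask import Flask, jsonify, request

app = Flask(__name__)
LOCAL_DB = "customers_local.db"  # placeholder path to the on-premises DB

@app.route("/resolve")
def resolve():
    # The remote app calls e.g. GET /resolve?ids=1,2,3 with opaque IDs only.
    ids = [int(i) for i in request.args.get("ids", "").split(",") if i]
    if not ids:
        return jsonify({})
    marks = ",".join("?" * len(ids))
    with sqlite3.connect(LOCAL_DB) as conn:
        rows = conn.execute(
            f"SELECT id, name FROM Customers WHERE id IN ({marks})",
            ids).fetchall()
    # The confidential names are served only inside the customer's network.
    return jsonify({str(row_id): name for row_id, name in rows})

if __name__ == "__main__":
    # Bind to an address reachable only within the customer's network.
    app.run(host="127.0.0.1", port=5001)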

Related

Running multiple sites in one CF instance

I'm running 2 sites (Dev and QA) under one instance of CF 2018. When I send a REST call to a CFC on Dev, the QA instance gets confused and starts delivering pages from the Dev site, probably due to Application or Session vars being changed.
I verified that the REST call is using the session vars from Application.cfm for the dev environment, as it should. But the QA instance somehow gets switched over to the Dev folder and starts running cfm modules from there. I can't find where the QA pages are getting redirected.
Thanks in advance for any ideas on things to look at.
Both DEV and QA are sharing the same application name and therefore the same application and session variables. You're switching context between requests, based on which domain / environment made the previous request.
In addition, you should convert your Application.cfm to Application.cfc and refactor how and when you're defining application and session variables. It feels like the CFM is just setting those values on every request instead of only once, which is why the different requests are switching context.
http://www.learncfinaweek.com/course/index/section/Application_cfc/item/Application_cfc/
You need to:
- run each environment on its own CF instance
- make the application name dynamic
- point each instance's data sources at its own database
Isolating the instances allows you to track separate logs per environment. Dynamic application names isolate the applications' shared scopes from each other. The data should be separated per environment to isolate in-development DB changes from QA testing.
https://coldbox.ortusbooks.com/getting-started/configuration/bootstrapper-application.cfc
Your code is in source control. You checkout a copy for DEV and one for QA, each to their own "web root".
/code/DEV/repo
/code/QA/repo
In your Application.cfc:
this.name = hash( getCurrentTemplatePath() );
This creates a dynamic application name based on the current folder path, so DEV and QA will get different names and the shared scopes are easily isolated. It also helps with local DEV efforts.
It's a bit of work, but there's no way to shortcut this.

How can I make a SQL application work offline?

I'm making a C++/Qt application. It connects to a small online database to find different information. I need to make sure that the application works offline, so I would like to do the following.
On startup of the application:
- Check if an internet connection is available
- If available, connect to the online database and download it to local storage (for the next time no internet is available)
- If not available, connect to the locally stored version of the database (sketched below)
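A rough sketch of that check, with Python standing in for the C++/Qt code just to illustrate the logic (the connection string is a placeholder):

import sqlite3

import pyodbc  # assumed MS SQL Server driver

def open_database():
    try:
        # Short login timeout so an offline machine fails fast.
        return pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=myhost;DATABASE=mydb;UID=user;PWD=secret",
            timeout=3)
    except pyodbc.Error:
        # No connection: fall back to the copy saved on the last online run.
        return sqlite3.connect("local_copy.db")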
My problem is that I can't find a simple solution for how to "download" the database. The user will not update the database, so there is no need for syncing when online again, just the ability to download the newest version of the database whenever online. It is an MS SQL Server database that the application uses.
My only idea for a solution is to have an SQLite DB in the application, and then write a script that clears the SQLite database and puts everything from the online server into it, but this requires that I write a script that goes through all of the database (roughly what I imagine is sketched below). There must be a better solution. I'm also not sure how this solution should work if the database structure changes. A solution for that could just be to send out an update for the application, with a new SQLite DB that has the new structure.
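In Python just to sketch the logic (my real app is C++/Qt; the connection string is a placeholder, and this assumes all tables sit in the default schema):

import sqlite3

import pyodbc  # assumed MS SQL Server driver

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myhost;DATABASE=mydb;UID=user;PWD=secret")
dst = sqlite3.connect("local_copy.db")

# Let the server's catalog enumerate the tables instead of hard-coding them.
tables = [row.TABLE_NAME for row in src.cursor().execute(
    "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
    "WHERE TABLE_TYPE = 'BASE TABLE'")]

for table in tables:
    cur = src.cursor()
    cur.execute(f"SELECT * FROM [{table}]")
    cols = ", ".join(f'"{d[0]}"' for d in cur.description)
    # Clear and recreate each table, so structure changes are picked up
    # automatically on the next download.
    dst.execute(f'DROP TABLE IF EXISTS "{table}"')
    dst.execute(f'CREATE TABLE "{table}" ({cols})')
    marks = ", ".join("?" * len(cur.description))
    dst.executemany(f'INSERT INTO "{table}" VALUES ({marks})',
                    [tuple(r) for r in cur.fetchall()])

dst.commit()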
I tried searching for a solution, but I could not find anything simple. Since I don't need syncing back and forth, I thought there must be a simple solution. Any help pointing me in the right direction is appreciated.

SOAPUI ability to switch between database connections for test suite

A bit of background:
I have a large number of SOAPUI test cases which test web services as well as database transactions. This worked fine when there were one or two different environments, as I would just clone the original test suite, update the database connections, and then update the endpoints to point to the new environment. A few changes here and there meant I would just re-clone the test cases which had been updated for the other test suites.
However, I now have 6 different environments which require these tests to be run and, as anticipated, I have been adding more test cases as well as changing original ones. This causes issues when running older test suites, as they need to be re-cloned.
I was wondering whether there is a better way to organise this. Ideally I would want one test suite and be able to switch between database connections and web service endpoints, but I have no idea where to start with this. Any help or guidance would be much appreciated.
I only have access to the Free version of SOAPUI
Here is how I would go about achieving this.
There is an original test suite which contains all the tests, configured to run against one server. As you mentioned, you cloned the suite for a second database schema and changed the connection details, and the problem has now become apparent since there are more databases that need to be tested.
Keep a single project with the required test suite. Wherever the database server details are provided, replace the actual values with property expansions for the connection details.
In the Jdbc step, change connection string from:
jdbc:oracle:thin:scott/tiger@//myhost:1521/myservicename
to:
jdbc:oracle:thin:${#Project#DB_USER}/${#Project#DB_PASSWORD}@//${#Project#DB_HOST}:${#Project#DB_PORT}/${#Project#DB_SERVICE}
You can define the following properties in a file and name it accordingly. Say the following properties describe the database hosted on host1; name the file host1.properties. When you want to run the tests against the host1 database, import this file into the project-level custom properties.
DB_USER=username
DB_PASSWORD=password
DB_HOST=host1
DB_PORT=1521
DB_SERVICE=servicename
Similarly, you can keep as many property files as you want and import the respective file before you run against the respective DB server.
You can use this property file not only for the database, but also for different web services hosted on different servers, such as staging, QA, and production, without changing the endpoints by hand. All you need is to set a property expansion in the endpoint.
Update based on the comment
When you want to use the same approach for web services, go to the service interface -> Service Endpoints tab, then add a new endpoint ${#Project#END_POINT}/context/path. Now click on the Assign button and select "All requests and Test Requests" from the drop-down. You may also remove the other endpoints.
Add a new property END_POINT to your property file, with a value such as http://host:port. This also gives you an advantage if you want to run the tests against https, say https://host:port.
And if you have multiple services/WSDLs which are hosted on different servers, you can use a unique property name for each service.
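For example, a single hypothetical qa.properties (all values are placeholders) could then drive both the JDBC steps and the endpoints:

END_POINT=http://qa-host:8080
DB_USER=qa_user
DB_PASSWORD=qa_password
DB_HOST=qa-db-host
DB_PORT=1521
DB_SERVICE=qaservice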
Hope this helps.

Deploying Django as standalone internal app?

I'm developing a tool using Django for internal use at my organization. It's used to search and tag documents (using Haystack and Solr), and will be employed on different projects. My team currently has a working prototype and we want to deploy it 'in the wild.'
Our security environment is strict. Project documents are located in subfolders on a network drive, and access to these folders is restricted based on users' Windows credentials (we also have an MS SQL server that uses the same credentials). A user can only access the projects they are involved in. Since we're an exclusively Microsoft shop, if we want to deploy our app on the company intranet, we'll need to use an IIS server to deal with these permissions. No one on the team has the requisite knowledge to work with IIS or Active Directory, and our IT department is already over-extended. In short, we're not web developers and we don't have immediate access to anybody experienced.
My hacky solution is to forgo IIS entirely and have each end user run a lightweight server locally (namely, CherryPy) while each retaining access to a common project-specific database (e.g. a SQLite DB living on the network drive or a DB on the MS SQL server). In order to use the tool, they would just launch an all-in-one batch script and point their browser to 127.0.0.1:8000. I recognize how ugly this is, but I feel like it leverages the security measures already in place (note that we never expect more than 10 simultaneous users on a given project). Is this a terrible idea, and if so, what's a better solution?
I've dealt with a similar situation (primary development was geared toward a normal deployment situation, but some users have a requirement to use the application on a standalone workstation). Rather than deploy web and db servers on a standalone workstation, I just run the app with the Django internal development server and a SQLite DB. I didn't use CherryPy, but hopefully this is somewhat useful to you.
My current solution makes a nice executable for users not familiar with the command line (who also have trouble remembering the URL to put in their browser) but also keeps development relatively easy:
- Use PyInstaller to package the Django app into a single executable. Once you figure this out, don't continue to do it by hand; add it to your continuous integration system (or at least write a script).
- Modify manage.py to (a sketch of the result follows this list):
  - Detect whether the app is frozen by PyInstaller and was started with no arguments (i.e., the user executed it by double-clicking it), and if so, run execute_from_command_line(..) with arguments that start the Django development server.
  - Right before running execute_from_command_line(..), spawn a thread that does a time.sleep(2) (to let the development server come up fully) and then calls webbrowser.open_new("http://127.0.0.1:8000").
- Modify the app's settings.py to detect whether it is frozen and change things such as the path to the DB, enabling the development server, etc.
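A minimal sketch of such a manage.py (mysite is a placeholder project name; the frozen attribute sys.frozen is set by PyInstaller):

import os
import sys
import threading
import time
import webbrowser

from django.core.management import execute_from_command_line

def open_browser():
    # Give the development server a moment to come up, then open the UI.
    time.sleep(2)
    webbrowser.open_new("http://127.0.0.1:8000")

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
    if getattr(sys, "frozen", False) and len(sys.argv) == 1:
        # Frozen by PyInstaller and double-clicked: start the dev server.
        # --noreload matters here; the autoreloader re-executes the binary.
        threading.Thread(target=open_browser, daemon=True).start()
        execute_from_command_line(
            [sys.argv[0], "runserver", "--noreload", "127.0.0.1:8000"])
    else:
        # Normal usage from the command line behaves as usual.
        execute_from_command_line(sys.argv)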
A couple additional notes.
If you go with SQLite, Windows file locking on network shares may not be adequate if you have concurrent writing to the DB; concurrent readers should be fine. Additionally, since you'll have different DB files for different projects, you'll have to figure out a way for the user to indicate which file to use. Maybe prompt in the app, or build the same app multiple times with different settings.py files. There are a variety of ways to hit this nail...
If you go with MSSQL (or any client/server DB), the app will have to know the DB credentials (which means they could be extracted by a knowledgeable user). This presents a security risk that may not be acceptable. Basically, don't make the app that the user is executing the only layer of security. The DB credentials used by the app should grant only the access that the user is allowed.

Good way to deploy a django app with an asynchronous script running outside of the app

I am building a small financial web app with Django. The app requires that the database has a complete history of prices, regardless of whether someone is currently using the app. These prices are freely available online.
The way I am currently handling this is by simultaneously running a separate Python script (outside of Django) which downloads the price data and records it in the Django database using the sqlite3 module.
My plan for deployment is to run the app on an AWS EC2 instance, change the permissions of the folder where the db file resides, and separately run the download script.
Is this a good way to deploy this sort of app? What are the downsides?
Is there a better way to handle the asynchronous downloads and the deployment? (PythonAnywhere?)
You can write the downloader as a daemon and follow this approach to push data to the DB as soon as you get it from the internet. Since your daemon would be running independently of Django, you'd need to take care of data-synchronisation issues as well. One possible solution is to use a DateTimeField in your Django model with auto_now_add=True, which records when each row was entered in the DB. Hope this helps you or someone else looking for a similar answer.
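A minimal sketch of this idea (the Price model, the mysite/myapp module paths, and fetch_latest_prices() are all hypothetical):

# models.py (in the Django app)
from django.db import models

class Price(models.Model):
    symbol = models.CharField(max_length=16)
    value = models.DecimalField(max_digits=12, decimal_places=4)
    # auto_now_add records when the daemon inserted the row.
    fetched_at = models.DateTimeField(auto_now_add=True)

# daemon.py -- the separate script, run outside the web app
import os
import time

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()  # lets a standalone script use the Django ORM and settings

from myapp.models import Price               # the model above
from myapp.feeds import fetch_latest_prices  # hypothetical download helper

while True:
    for symbol, value in fetch_latest_prices():
        Price.objects.create(symbol=symbol, value=value)
    time.sleep(60)  # poll interval; tune to how often the prices update

Writing through the ORM rather than the raw sqlite3 module also keeps the daemon and the web app agreeing on the schema.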