I have an Android application on the market which connects to a REST API, sends POST and GET requests, and then stores the results in a DB, which are then queried and displayed in an appropriate manner in the application.
I'm interested in speeding up the application and have noticed quite a lot of lag between receiving the data back from the API and the data being ready to use. I'd like to investigate if and how I can write similar code in C++ using the NDK to connect to the REST API, process the results, and store them in a DB or raise an error. I've no previous C++ experience and need to know, firstly, whether I can access the same DB from the C++ code as from the Java, and secondly, whether there are any other caveats I should be aware of.
Also I guess I should ask - is it worth doing this? Will I notice any difference?
Any links to similar code, or an overview of where I should look to get started in C++, would be greatly appreciated.
I'm doing the EXACT same thing, and trust me: if you have no previous C++ experience, this might be a bit too costly for little benefit.
In my case, after some profiling, I reordered a few things and got an initial jump in performance just by dropping DOM and using SAX. Everything else only made things marginally better, like processing the response while packets are still being transmitted (i.e. not waiting for the full response before starting to process it), and multiplexing requests on the same thread instead of starting a new thread for each.
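To make the SAX idea concrete, here is a minimal sketch using the Expat parser. The element names and the hard-coded chunk are placeholders; in a real app you would feed each network chunk to `XML_Parse` as it arrives, so parsing starts before the transfer finishes:

```cpp
#include <expat.h>
#include <cstring>
#include <iostream>

// Called for every opening tag; pick out only the elements you care about.
static void XMLCALL onStartElement(void *userData, const XML_Char *name, const XML_Char **attrs) {
    // e.g. if (std::strcmp(name, "item") == 0) { /* start a new DB row */ }
}

// Called for text between tags; note it may arrive in several pieces.
static void XMLCALL onCharacterData(void *userData, const XML_Char *s, int len) {
    // append to the field currently being built
}

static void XMLCALL onEndElement(void *userData, const XML_Char *name) {
    // e.g. if (std::strcmp(name, "item") == 0) { /* insert the finished row */ }
}

int main() {
    XML_Parser parser = XML_ParserCreate(nullptr);
    XML_SetElementHandler(parser, onStartElement, onEndElement);
    XML_SetCharacterDataHandler(parser, onCharacterData);

    // Placeholder chunk; in practice, call XML_Parse once per received chunk
    // and pass XML_TRUE only with the final one.
    const char chunk[] = "<items><item>example</item></items>";
    if (XML_Parse(parser, chunk, static_cast<int>(std::strlen(chunk)), XML_TRUE) == XML_STATUS_ERROR) {
        std::cerr << "parse error: " << XML_ErrorString(XML_GetErrorCode(parser)) << '\n';
    }

    XML_ParserFree(parser);
    return 0;
}
```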
What you should be looking for on Google is POSIX sockets, HTTP, and REST code, if you wish to do it all by hand. A better option might be using libcurl or something similar for the socket/HTTP part. I did it all myself, but only because I have already done this a few times.
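If you go the libcurl route, a minimal GET (with the POST variant noted in a comment) could look like the sketch below; the endpoint is a placeholder and error handling is kept to a bare minimum:

```cpp
#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl calls this as response data arrives, so you could start parsing
// (e.g. feed a SAX parser) before the transfer has finished.
static size_t onData(char *ptr, size_t size, size_t nmemb, void *userdata) {
    auto *body = static_cast<std::string *>(userdata);
    body->append(ptr, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    std::string response;
    curl_easy_setopt(curl, CURLOPT_URL, "https://api.example.com/items"); // placeholder endpoint
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, onData);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

    // For a POST, additionally set the request body:
    // curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=value");

    CURLcode rc = curl_easy_perform(curl);
    if (rc != CURLE_OK)
        std::cerr << "request failed: " << curl_easy_strerror(rc) << '\n';
    else
        std::cout << response << '\n';

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```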
Related
I am new to network programming and got started with Boost, REST, etc. I wanted to know whether I could use REST APIs with Boost.Asio, for example calling Google Maps' Distance Matrix from my program, but I couldn't find proper documentation for Boost.
I don't expect you to give me complete working code; rather, I need an idea or some sort of guidance as to what to do, where to find things, etc. Also, this program will be purely in C++ (I don't know if it can even be done in C++, given this answer https://stackoverflow.com/a/28736632/4846740). Thanks
Note: this post was not very helpful: Integrating Google maps with C++ Program
You'd want to
use a REST library (e.g. cpprestsdk or another framework like autobahn-cpp?), or
at least write the REST requests on top of an HTTP library, such as Boost.Beast
The library examples show you everything you need to send requests and receive responses. If you want, you can use additional libraries like https://github.com/0xdead4ead/BeastHttp to make it even more high-level.
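As an illustration of the first option, here is a minimal sketch with cpprestsdk that queries the Distance Matrix endpoint. The query path, parameters, and API key are placeholders written from memory of Google's documentation, so double-check them before relying on this:

```cpp
#include <cpprest/http_client.h>
#include <cpprest/json.h>
#include <iostream>

using namespace web;                 // json
using namespace web::http;          // methods, http_response
using namespace web::http::client;  // http_client

int main() {
    // Build the query for the Distance Matrix web service (parameters are placeholders).
    uri_builder query(U("/maps/api/distancematrix/json"));
    query.append_query(U("origins"), U("Seattle,WA"));
    query.append_query(U("destinations"), U("Portland,OR"));
    query.append_query(U("key"), U("YOUR_API_KEY"));

    http_client client(U("https://maps.googleapis.com"));

    client.request(methods::GET, query.to_string())
        .then([](http_response response) {
            // Throws if the body is not valid JSON.
            return response.extract_json();
        })
        .then([](json::value body) {
            ucout << body.serialize() << std::endl;
        })
        .wait();  // block only because this is a tiny console example

    return 0;
}
```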
I would 100% recommend that you do not do this in C++. While I'm not a huge fan of Python, it's undoubtedly the hammer that is made for this nail. Check out BeautifulSoup, Mechanize, and Scrapy (+XPath) for really convenient ways of obtaining/parsing HTML, filling out web forms, and handling responses. Typically, unless you're doing realtime target tracking, you do not need the low latency gained from running everything in C/C++. You can get away with quarter-second, or even half-second, updates.
I'm not sure what you're trying to do, but I would say save yourself the headache, and just work with Python.
I am using Ember 3.0 at the moment. Wrote my first lines of code in ANY language about 1 year ago (I switched careers from something totally unrelated to development), but I quickly took to ember. So, not a ton of experience, but not none. I am writing a multi-tenant site which will include about 20 different sites, all with one Ember frontend and a RubyOnRails backend. I am about 75% done with the front end, now just loading content into it. I haven’t started on the backend yet, one, because I don’t have MUCH experience with backend stuff, and two, because I haven’t needed it yet. My sites will be informational to begin with and I’ll build it up from there.
So. I am trying to implement a news feed on my site. I need it to pull in multiple RSS feeds, perhaps dozens, filtered by keyword, and display them on my site. I’ve been scouring the web for days just trying to figure out where to get started. I was thinking of writing a service that parses the incoming XML. I tried using a third-party widget (which I DON’T really want to do. Everything on my site so far has been built from scratch and I’d like to keep it that way), but with these third-party systems I get some random cross-domain errors and node-child errors which only SOMETIMES pop up. Anyway, I’d like to write this myself, if possible, since I’m trying to learn (and my brain is wired to do the code myself - the only way it sticks with me).
Ultimately, every Google result I read says RSS feeds are easy to implement. I don’t know where I’m going wrong, but I’m simply looking for:
1: An “Ember-way” starting point.
2: Is this possible without a backend?
3: Do I have to use a third-party widget/aggregator?
4: Whatever else you think might help on the subject.
Any help would be appreciated. Here in New Hampshire, there are basically no resources, no meetings, nothing. Thanks for any help.
Based on the results I get back when searching on this topic, it looks like you’ll get a few snags if you try to do this in the browser:
CORS header issues (sounds like you’ve already hit this)
The joy of working with XML in JavaScript (that just might be sarcasm 😉, it’s actually unlikely to be fun)
If your goal is to do this as a learning exercise, then doing it in JavaScript/Ember will definitely help you learn lots of new things. You might start with this article as a jumping-off point: https://www.raymondcamden.com/2015/12/08/parsing-rss-feeds-in-javascript-options/
However, if you want to have this be maintainable for the long run and want things to go quickly and smoothly, I would highly recommend moving the RSS parsing system into your backend and feeding simple data out to Ember. There are enough gotchas and complexities to RSS feeds over time that using a battle-tested library is going to be your best way to stay sane. And loading that type of library up in Ember (while quite doable) will end up increasing your application size. You will avoid all those snags (and more I’m probably not thinking of) if you move your parsing back to the server ...
I hope you are doing well, and I really appreciate your help with my query.
We have our system T3000 written in C++ (http://www.temcocontrols.com/ftp/software/9TstatSoftware.zip, and the code is available here: https://github.com/temcocontrols/T3000_Building_Automation_System).
I am trying to integrate the BIRT reporting tool into my C++ application. I want to create reports based on the data available in our T3000 system. I think BIRT is embeddable (??). We don't need to compile and change the project; we mainly just need to be able to call it from T3000.exe.
My thinking is that we could add a menu item to the existing T3000 and display the report on a single user click.
Can you please help me solve my issue with BIRT? I really appreciate your answer.
Regards
Raju
Well, the answer depends on what your definition of "embeddable" is.
BIRT is written in pure Java.
I can think of four different ways:
1. Of course it is possible to integrate Java code into an existing C/C++ program (see Embed Java into a C++ application?).
2. You could just use the BIRT runtime engine and generate the report as PDF or HTML from the command line (that means, basically, you call the java executable from your program with several arguments; a minimal sketch follows this list). See Birt - How to run report engine on the console? and http://eclipser-blog.blogspot.de/2008/02/automatic-generation-of-birt-reports.html for more information.
3. You could run a Java web server like Tomcat in a second process and then start your report by calling an HTTP URL (e.g. you could use the included Servlet example). See http://www.eclipse.org/birt/documentation/integrating/viewer-usage.php
4. Similar to 3. (see below)
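To illustrate the second option, the call from the existing C++ application could be as simple as spawning the BIRT runtime in a separate Java process. The paths, report design name, and command-line flags below are placeholders; check the genReport script (or the equivalent java invocation) that ships with your BIRT runtime for the real arguments:

```cpp
#include <cstdlib>
#include <iostream>
#include <string>

// Generate a report by invoking the BIRT runtime in a separate Java process.
// Paths and flags are placeholders; adapt them to the genReport script of
// your BIRT runtime installation.
bool generateReport(const std::string &designFile, const std::string &outputFile) {
    std::string cmd = "C:\\birt-runtime\\ReportEngine\\genReport.bat"
                      " -f PDF"
                      " -o \"" + outputFile + "\""
                      " \"" + designFile + "\"";
    int rc = std::system(cmd.c_str());
    if (rc != 0) {
        std::cerr << "BIRT report generation failed, exit code " << rc << '\n';
        return false;
    }
    return true;
}

int main() {
    if (generateReport("C:\\reports\\t3000_overview.rptdesign",
                       "C:\\reports\\t3000_overview.pdf"))
        std::cout << "Report written.\n";
    return 0;
}
```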
Some notes:
The second option is slow due to the Java and BIRT engine startup overhead (this may take several seconds). With the first and third options, the startup overhead is, or can be reduced to, a one-time cost instead of being paid for each report.
For the second and third options it may be necessary to modify the existing code of the example programs to suit your needs.
The first option is probably the best for an industry-quality solution, but it is also the most difficult to develop.
Anyway, Java skills are necessary IMHO.
If you plan to run this on a SOC instead of a PC, take performance into account.
Is a Java-based solution well-suited for this kind of hardware? BIRT needs quite a lot of RAM and CPU (for a SOC). Hardware like the Raspi 3 should handle this quite easily, I reckon.
I integrated the BIRT runtime into an existing Python application (all of this running on an application server) in a fourth way: I wrote a listener program that listens on a TCP socket for BIRT tasks. It uses a pool of worker processes (written in Java), which in turn use the BIRT report engine to generate the output. The client program (here: written in Python) opens a TCP connection to the listener and uses this socket to tell it which report to generate (including report parameters and the destination file name). The listener then chooses a worker process and hands the task to that worker.
So, basically, this fourth option is similar to the third one, with two differences:
The communication is socket-based (instead of HTTP), allowing bi-directional communication.
The architecture is multi-process instead of multi-threaded. We chose this because very large reports could cause out-of-memory errors for otherwise unrelated reports that just happen to run at the same time. It's the same basic architecture Oracle chose for their reports server.
However, developing the programs took months.
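For what it's worth, the client side of such a socket-based setup can stay tiny even in C++. The sketch below uses Boost.Asio; the host, port, and one-line message format are purely hypothetical stand-ins for whatever protocol the listener actually speaks:

```cpp
#include <boost/asio.hpp>
#include <iostream>
#include <string>

int main() {
    namespace asio = boost::asio;
    using asio::ip::tcp;

    try {
        asio::io_context io;

        // Connect to the report listener (host and port are hypothetical).
        tcp::resolver resolver(io);
        tcp::socket socket(io);
        asio::connect(socket, resolver.resolve("report-host", "9099"));

        // Hypothetical one-line request: report design, parameters, output file.
        std::string request =
            "RUN t3000_overview.rptdesign site=42 out=/tmp/t3000_overview.pdf\n";
        asio::write(socket, asio::buffer(request));

        // Read the single-line reply (e.g. "OK" or an error message).
        asio::streambuf reply;
        asio::read_until(socket, reply, '\n');
        std::istream is(&reply);
        std::string line;
        std::getline(is, line);
        std::cout << "listener replied: " << line << '\n';
    } catch (const std::exception &e) {
        std::cerr << "report request failed: " << e.what() << '\n';
        return 1;
    }
    return 0;
}
```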
HVB: I have to give you more than a simple thanks for the explanation above, this info will save us time I am sure. Raju will be sharing our experience after we get into the project a little deeper so others can benefit.
I am pulling a good amount of data from a Java application using a web service. The data is fairly complex in structure, with a lot of hierarchy built from array collections. I am experiencing a huge performance issue of around 15 seconds (on JBoss and WebSphere) to get the data loaded. Most of the time is spent converting the service data into the Flex object structure. The issue gets worse when moving to the WebLogic application server. I am using the Axis2 framework.
Is there any way to optimize this? What alternative technologies could I use instead of web services?
I'm afraid you may not like my answer, because it will involve a lot of refactoring. I can't think of any easy fixes.
What alternative technologies could I use instead of web services?
You'll get the best performance by using AMF remoting instead of web services. Here's an article that explains what it is and contains a benchmark showing that this could easily cut your response time in half: http://www.themidnightcoders.com/products/weborb-for-net/developer-den/technical-articles/amf-vs-webservices.html. And that benchmark is using .NET on the server side; it will work even better with a Java server.
Is there any way to optimize this?
You should consider refactoring the objects you pass to the client into "Data Transfer Objects" (DTOs). These are simple value objects that contain only the data the client needs to display, which means less time spent transferring data from the server to the client and less time spent converting objects to ActionScript classes.
How can you limit the work involved?
You could add a layer on the server side that calls your existing web services, converts the complex data into simple DTOs, and delivers them to the client through AMF services. That way you can leave your existing code intact and still get a significant performance boost.
I have a Windows Phone 7 app that (currently) calls an OData service to get data, and throws the data into a listbox. It is horribly slow right now. The first thing I can think of is that OData returns way more data than I actually need.
What are some suggestions/best practices for speeding up the fetching of data in a Windows Phone 7 app? Is there anything I could be doing in the app to speed up the retrieval of data and get it in front of the user faster?
Sounds like you've already got some clues about what to chase.
Some basic things I'd try are:
Make your HTTP requests as small as possible - if possible, only fetch the entities and fields you absolutely need.
Consider using multiple HTTP requests to fetch the data incrementally instead of fetching everything in one go (this can, of course, actually make the app slower, but generally makes the app feel faster)
For large text transfers, make sure that the content is being zipped for transfer (this should happen at the HTTP level)
Be careful that the XAML rendering the data isn't too bloated - large XAML structure repeated in a list can cause slowness.
When optimising, never assume you know where the speed problem is - always measure first!
Be careful when inserting images into a list - the MS MarketPlace app often seems to stutter on my phone - and I think this is caused by the image fetch and render process.
In addition to Stuart's great list, also consider the format of the data that's sent.
Check out this blog post by Rob Tiffany. It discusses performance based on data formats. It was written specifically with WCF in mind but the points still apply.
As an extension to Stuart's list:
In fact there are 3 areas - communication, parsing, UI. Measure them separately:
Do just the communication with the processing switched off.
Measure the parsing of a fixed OData-formatted string.
Believe it or not, it can also be the UI.
For example, bad usage of a ProgressBar can result in a dramatic decrease in processing speed. (In general you should not use any UI animations, as explained here.)
Also, make sure that the UI processing does not block the data communication.