Programming Arduino DUE without IDE (Linux) - c++

Is there any current, relatively simple way to compile and upload full .c/.cpp files for the Arduino DUE on Linux?
I'm regularly running into issues with the boilerplate code they wrap around the sketches, and so far there is very little documentation or alternative IDE support for the Arduino 1.5 SDK... On top of that, the official 1.0.5 IDE is hopelessly broken on Linux right now (serial port issues, among other things).

There is a great example here.
He explains what you need and how to use it to upload to the Due from the terminal of a Linux box.
He has done a great job in helping you set up an environment to compile and upload your c programs onto the SAM3X8E. He even gives you a makefile and sample code. What more could you ask for?
Give it a try, see if it works for you.

Even though you can program in C/C++ for the Arduino, the Arduino does not "use" plain C/C++ code alone per se. When you use the Arduino IDE, a few libraries are linked in at compile time to give you the "Arduino" functions like setup() and loop(), as well as constants such as HIGH and LOW. The Arduino language conventions come from Wiring, and the Arduino IDE itself is derived from the Processing IDE (which is written in Java, not C).
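To make that concrete: the core that gets linked in is just a regular C++ main() that calls your sketch's functions. A simplified sketch of it (the real core's main.cpp also initializes the hardware and services serial events):

#include <Arduino.h>

// simplified version of the main() the Arduino core supplies
int main(void)
{
    init();      // hardware init done by the core (timers, ADC, ...)
    setup();     // your sketch's setup(), once
    for (;;)
        loop();  // your sketch's loop(), forever
    return 0;
}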
If you are having trouble with the Arduino IDE, it might help to download an older version. Check out the Previous Releases page on their website.
If that still doesn't work, you could try to build it from the source code. https://code.google.com/p/arduino/wiki/BuildingArduino

I wanted to do the same thing, as I really don't like IDEs like Eclipse, and I didn't want to rely on the Arduino environment. Just something minimalist under Ubuntu.
For libraries, I downloaded the ASF (Atmel Software Framework) here http://www.atmel.com/tools/avrsoftwareframework.aspx
For compiling, I installed gcc-arm-embedded from here https://launchpad.net/~terry.guo/+archive/ubuntu/gcc-arm-embedded. This provides gcc-arm-none-eabi.
The last part is 'bossac', the upload tool (just do an apt-get install bossa-cli).
Then just use Vim or your favorite editor, adapt a config.mk (the configuration for the ASF's Makefile) for your own project, and once it builds, upload the .bin to your board with bossac.
Note that bossac has to be run as root (sudo) if you want it to detect your USB port (/dev/ttyACM0).
After a few days playing with the examples provided by the ASF library, you'll be able to get by with a small subset of the header files (only basic definitions like register names and bit names within registers) and do all the rest by yourself. I personally don't even use the 'drivers' files anymore. I access registers directly with my own methods to get smaller code.
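To give a flavour of that register-level style, here is a minimal LED blink for the Due (the on-board "L" LED sits on PB27), written against the CMSIS device header that ships with the ASF. Treat it as a sketch only - you still need the startup code and linker script from an ASF example to actually build and link it:

#include "sam3x8e.h"   /* CMSIS device header from the ASF */

int main(void)
{
    WDT->WDT_MR = WDT_MR_WDDIS;   /* the watchdog runs by default; disable it */
    PIOB->PIO_PER = PIO_PB27;     /* hand pin PB27 to the PIO controller */
    PIOB->PIO_OER = PIO_PB27;     /* configure it as an output */
    for (;;) {
        PIOB->PIO_SODR = PIO_PB27;                    /* drive the line high: LED on */
        for (volatile int i = 0; i < 500000; i++) {}  /* crude busy-wait delay */
        PIOB->PIO_CODR = PIO_PB27;                    /* pull the line low: LED off */
        for (volatile int i = 0; i < 500000; i++) {}
    }
}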

Related

Using tensorflow in C++ on Windows

I know there are ways of using TensorFlow in C++ (they even have documentation for it), but I can't seem to get the library for it. I've checked the build-from-source instructions, but they seem to build a pip package rather than a library I can link to my project. I also found a tutorial, but when I tried it out I ran out of memory and my computer crashed. My question is: how can I actually get the C++ library to work in my project? I have these requirements: I have to work on Windows with Visual Studio in C++. What I would love is a pre-compiled DLL that I could just link, but I haven't found such a thing, and I'm open to other alternatives.
I can't comment so I am writing this as an answer.
If you don't mind using Keras, you could use the frugally-deep package. I haven't seen a library myself either, but I came across frugally-deep and it seemed easy to implement. I am currently trying to use it, so I cannot guarantee it will work.
You could check out neural2D from here:
https://github.com/davidrmiller/neural2d
It is a neural network implementation without any dependent libraries (all written from scratch).
I would say that the best option is to use cppflow, a thin wrapper that I created to use TensorFlow from C++ easily.
You won't need to install anything; just download the TF C API and place it somewhere on your computer. You can take a look at the docs to see how to do it, and how to use the library.
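For a sense of what that looks like, a minimal cppflow call site is roughly this (following the style of the examples in the cppflow README; the model path is a placeholder):

#include <iostream>
#include "cppflow/cppflow.h"

int main() {
    // Load a TensorFlow SavedModel directory (path is a placeholder)
    cppflow::model model("path/to/saved_model");

    // A dummy 1x224x224x3 input filled with ones, standing in for an image
    auto input = cppflow::fill({1, 224, 224, 3}, 1.0f);

    auto output = model(input);
    std::cout << output << std::endl;
    return 0;
}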
The answer seems to be that it is hard :-(
Try this to start. You can follow the latest instructions for building from source on Windows up to the point of building the pip package. But don't do that - build these targets instead:
bazel build --config=opt //tensorflow:tensorflow.dll
bazel build --config=opt //tensorflow:tensorflow.lib
bazel build --config=opt //tensorflow:install_headers
That much seems to work fine. The problems really start when you try to use any of the header files - you will probably get compilation errors, at least with TF version >= 2.0. I have tried:
Build the label_image example (instructions in the readme.md file)
It builds and runs fine on Windows, meaning all the headers and source are there somewhere
Try incorporating that source into a Windows console executable: runs into compiler errors due to conflicts with std::min & std::max, probably caused by the Windows SDK's min/max macros (the usual workaround is to define NOMINMAX before including Windows headers).
Include c_api.h in a Windows console application: won't compile.
Include TF-Lite header files: won't compile.
There is little point investing the lengthy compile time in the first two bazel commands if you can't get the headers to compile :-(
You may have time to invest in resolving these errors; I don't. At this stage Tensorflow lacks sufficient support for Windows C++ to rely on it, particularly in a commercial setting. I suggest exploring these options instead:
If TF-Lite is an option, watch this
Windows ML/Direct ML (requires conversion of TF models to ONNX format)
CPPFlow
Frugally Deep
Keras2CPP
UPDATE: having explored the list above, I eventually found the following worked best in my context (real-time continuous item recognition):
convert models to ONNX format (use tf2onnx or keras2onnx)
use Microsoft's ONNX runtime
Even though Microsoft recommends using DirectML directly where milliseconds matter, the performance of ONNX Runtime with DirectML as an execution provider means we can run a 224x224 RGB image through our Intel GPU in around 20 ms, which is quick enough for us. But it was still hard finding our way to this answer.
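For anyone heading down the same path, the ONNX Runtime C++ API needs surprisingly few calls. A minimal sketch (the model path and tensor names are placeholders that must match your exported model; the wide-string path is the Windows form of the session constructor):

#include <onnxruntime_cxx_api.h>
#include <array>
#include <vector>
#include <iostream>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "demo");
    Ort::SessionOptions opts;
    Ort::Session session(env, L"model.onnx", opts);  // wide string on Windows

    // A dummy NCHW float input; the shape must match your exported model
    std::array<int64_t, 4> shape{1, 3, 224, 224};
    std::vector<float> data(1 * 3 * 224 * 224, 0.0f);
    Ort::MemoryInfo mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
    Ort::Value input = Ort::Value::CreateTensor<float>(
        mem, data.data(), data.size(), shape.data(), shape.size());

    const char* input_names[] = {"input"};    // placeholder tensor names
    const char* output_names[] = {"output"};
    auto outputs = session.Run(Ort::RunOptions{nullptr},
                               input_names, &input, 1, output_names, 1);
    std::cout << "got " << outputs.size() << " output tensor(s)\n";
    return 0;
}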

How to develop a cross-platform C++ project?

I'm a C++ beginner and I'm starting to develop my first cross-platform C++ project. I need to use platform-specific calls (Win32 and POSIX) so I need to compile frequently both in Windows and Linux.
With single-platform projects I have been using, until now, KDevelop on Linux and Visual Studio 2012 on Windows.
How can I use two different IDEs in two different Operating Systems with the same project?
Should I use a single, cross-platform, IDE?
Should I learn CMake (or similar) and configure it to work with both IDEs?
Could/Should I host my code in the web and sync automatically with offline projects?
Alternatives?
Thanks in advance to everyone.
EDIT:
Just for clarification, the project will be a simple server for a scholastic protocol. There will be a client asking for upload/retrieve some files to/from the server.
By scholastic I mean that, for example, I have to use pthreads/Win32 threads instead of a higher-level C++ threading library.
Maybe - really depends on what you feel most comfortable with. As the project is non-graphical, all the IDE gives you is editing of files and compilation. So you can build the project on one machine with the IDE there, and then move the sources to another machine for compiling there.
I personally would just have two makefiles, one for Linux and one for Windows. Makes life fairly simple [you could have an "outer" makefile that picks the right one based on some clever method, as sketched below].
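Such an outer makefile can be tiny. A sketch, assuming GNU make on both sides and that the two real makefiles are called Makefile.win and Makefile.linux (it keys off the OS environment variable that Windows always sets):

# Makefile - dispatch to the platform-specific makefile
ifeq ($(OS),Windows_NT)
    include Makefile.win
else
    include Makefile.linux
endif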
Yes, you should find a version control system that works on both Windows and Linux (git, mercurial, subversion, bazaar and several others). That way, not only do you have a central repository [you can use either of your machines as the "server" for any of these], but it also allows you to keep track of your changes. Definitely worthwhile doing!
There are hundreds of different alternatives. But the simpler you keep it, and the less complicated your tools are, the more time you get to spend on actually programming your project rather than, for example, figuring out why CMake isn't building the way you want it to.
Also, make sure that you separate out all your system-specific code to one file per architecture. That way, it's easy to port to another architecture later, and it makes MOST of your code compile on both systems.
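One common shape for that separation is a platform-neutral header with one implementation file per platform, letting the build pick which file to compile. A sketch (all names here are made up):

// sys_info.h - the platform-neutral interface the rest of the code sees
#ifndef SYS_INFO_H
#define SYS_INFO_H
#include <string>
std::string os_name();   // implemented once per platform
#endif

// sys_info_win.cpp - compiled only in the Windows build
#include "sys_info.h"
std::string os_name() { return "Windows"; }

// sys_info_posix.cpp - compiled only in the Linux build
#include "sys_info.h"
std::string os_name() { return "POSIX"; }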
Typically, it's easy to adjust the IDE-specific project/build files to reflect added/moved/deleted source files. Therefore, using a cross-platform IDE isn't that important.
You can do that, I think that CMake can also create project files for some IDEs that can then be used to build the project.
Whether you want to host it online is your choice. What you should definitely do is use some kind of version control. A bug-tracking system is also helpful. If you want to open-source the code anyway, using one of the existing hosting facilities is a clear yes.
Not really.
One comment though: You will have much more trouble making the C++ code portable. Building on top of a toolkit like Qt is a tremendous help. If you want to stay closer to standard C++, at least consider using Boost for stuff like threads, smart pointers, filesystem access. Good luck!
My recent experience suggests taking a look at Qt. The IDE (QtCreator) is very good, and available on all major platforms.
I've used it for a fairly simple project that uses fairly complex components, like OpenCV and ZBar. I develop on Linux, copy the source to Windows, and recompile.
I had some trouble setting up OpenCV on both platforms, so I can't say it's super easy, but it's working. Since you already know KDevelop, Qt should feel familiar.
I also put much value in the recent trend that sees Qt5 as the platform for Ubuntu on smartphones. I really hope to see this develop.
HTH
How can I use two different IDEs in two different Operating Systems with the same project?
Should I use a single, cross-platform, IDE?
No. I think this is a case of asking the wrong question. To make a cross-platform project, what matters is your build scripts and the system-neutral nature of your code. Sometimes it might help to have project files for your preferred IDE, but maintaining multiple project files for multiple IDEs will only make things more difficult and complex for you. Instead, you should focus on finding a build system that minimizes the amount of time you spend on project maintenance.
For that, CMake and PreMake seem to be two of the best tools to make that happen.
There are dozens of alternatives (like SCons, Cook, kbuild, Jam and Boost Jam, and many others), but since CMake and PreMake both generate project files and build scripts, they might be the best solutions.
Your mileage will vary.
Could/Should I host my code in the web and sync automatically with offline projects?
You should have robust source control that works everywhere you do. Git and Mercurial seem to work best if you use some kind of "cloud" hosting like GitHub or BitBucket, but they by no means require it. Depending on your work environment and team size, you may prefer Subversion or Perforce or something else, but that's up to you and your team.
It will help, as you will quite likely need to debug on many platforms... Qt Creator, NetBeans and Eclipse come to mind.
Yes. cmake, or qmake for Qt maybe
Not really a technical question. Just use version control! GitHub and Gitorious are easy choices for an open-source project, though.
Qt is a no-brainer choice for a cross-platform C++ GUI app, and also a decent choice for a network app with no GUI.

xcode's executable product for c++ project

I've completed a simple numbers version of the game "Towers of Hanoi" using Xcode's command-line tool in C++. Being accustomed to PC compilers like Borland's and Visual C++, I've attempted to "share" the game with others via e-mail by simply attaching the executable product built by Xcode. However, the recipients can't run the program, as it shows up in a different format - usually machine code, it sounds like.
After a bit of extensive searching, I'm realizing the complexity of building projects within Xcode's IDE and the variations on the build settings, targets, etc.
Anyone know how to build a self-contained C++ executable that can be run universally? I don't go outside the STL for this game. I'd greatly appreciate any help.
thanks
OS X is based on Unix, which uses plain binary files (i.e. no filename extension) as executables. If they have a certain "executable permission," they can be double-clicked to be run as executables, or run from the command line. However, this permission can't be sent over email - it's metadata within the file system itself, and this makes sense from a security standpoint (you wouldn't want spammers sending you executable viruses over email right?). So when the recipient receives the binary, they'll need to run the following command line command on it, assuming "hanoi" is the name of the binary file:
chmod +x /path/to/hanoi
If you really want to package it as an instantly double-clickable application, you'll need to give it a native UI and package it as a .app, then put that .app (which is actually a folder with the .app extension) in an archive to distribute. Sorry if that's more work than you were hoping for. Hope this helps!
Sharing applications across dot releases of the same OS can be notoriously difficult on the Mac (at least, as far as personal experience goes).
In order to be able to share your application with the least amount of effort, you will need to figure out:
What project type is this? Are you using any resources like images etc?
What version of the OS your friends are using? If they are not on the Mac, you're out of luck (or you'll have to recompile for their OS-es).
If they run Mac, check that you have the same OS versions; if you have developed on Leopard and someone's running Snow Leopard, your application might simply fail. (I also ran into issues between Mac OS 10.5.4 and 10.5.3, so keep your fingers crossed.)
Check what sort of hardware you are running. Are you building for your hardware (say, MacIntel) only, or are you creating a Universal Binary?
Make sure that all resources are packaged into your application bundle. Make sure your application uses only relative paths.
Check that you are not writing to special folders (i.e. use only temp and/or world-writable locations, if you need to).
I wish I could give a more detailed/to the point reply but unfortunately you'll have to figure out some of the answers yourself (without any other specific information about the error you are getting).
If you're satisfied with a command line tool rather than a double-clickable app, it should suffice to zip it and attach that to the e-mail. Be sure to build universal if anyone you're sending to might be using a PowerPC-based Mac. Oh, and set the deployment target to the minimum OS that any recipient might be using.

Port Visual Studio C++ to Linux

We have a not very complicated but big (i.e. lots of files) Visual Studio C++ Win32 console application, written in the C++0x standard in VS2010.
It does not use any non standard code or anything (Hopefully!).
I now wanna port it to Linux.
What is the quickest way to do it?
autoconf?
old-fashioned make file?
any other solution?
I would use regular make but keep it simple with default rules as much as possible. Add in dependencies as you go along.
EDIT: As an interim step, build it with MinGW so that you can avoid the whole API-porting issue until you have a working build in your new build mechanism.
If your console app calls win32 API functions then you have a choice between modifying all the source where it is used or writing a module that implements those functions.
In prior porting efforts of this type I tried it both ways and the latter was easier. I ended up writing only about 18 to 20 shim functions.
It was successful enough that I ended up writing an OS abstraction layer that was used on many projects that simply let me compile on Windows native, cygwin, Linux, VxWorks, etc. with trivial changes to one or two files.
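To give a flavour of the shim approach, here is a sketch of two such functions (the Win32 signatures are real; which calls you actually need to shim depends entirely on your codebase):

// win32_shims.cpp - minimal POSIX stand-ins for Win32 calls the code used
#include <time.h>
#include <unistd.h>
#include <cstdint>

typedef uint32_t DWORD;

// Win32's Sleep() takes milliseconds
void Sleep(DWORD milliseconds) {
    usleep(milliseconds * 1000);
}

// Win32's GetTickCount() counts milliseconds since boot; a monotonic
// clock with an arbitrary epoch is usually an acceptable stand-in
DWORD GetTickCount() {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (DWORD)(ts.tv_sec * 1000 + ts.tv_nsec / 1000000);
}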
(p.s. Any interest in an open source version of a C++ based OS abstraction layer? I was thinking of releasing an unencumbered version of it to the world if there's sufficient interest. It's mostly useful where BOOST is too heavy -- i.e. embedded projects.)
Most probably you don't need autoconf (and I suggest you don't touch it, unless you love pain), because you are not trying to be portable to a dozen Unix flavours.
Roll your Makefiles manually. It shouldn't be too difficult if you have a set of shared rules and have minimal Makefiles that just specify source files and compile options.
Use GCC 4.5, as it supports more C++0x features.
You can export a makefile from Visual Studio.
Update: Actually you can't anymore, unless you have VC6 lying around.
STAY AWAY FROM AUTO* and configure. These are horrible IMHO.
If you can somehow get a VS 8 or 9 vcproj/sln, you can use this. I have not used it, so I can't give any advice.
If you're up to manual conversion, I would suggest something like CMake, as it's pretty easy to get ready fast enough, even for large projects.
If the project has a simple layout, you could have success using Qt 4's qmake like this:
qmake -project
It will output a qmake .pro file, which can be converted into a makefile on many platforms (using qmake). This should work okay, but not perfectly. Alternatively, you can install the Qt plugin for VS, and let it generate the pro file from an existing VS project. It will make your build system depend on Qt4's qmake, which is probably not what you want.
There are of course other things like cmake, but they will all require manual intervention.
The fastest way to do it?
g++ *.cpp -o myapp
Seriously, depending on your needs, even generating a makefile might be overkill. If you're just interested in a quick and dirty "let's see if we can get a working program on Linux", just throw your code files at g++ and see what happens.

Simultaneous C++ development on Linux and Windows

We have a handful of developers working on a non-commercial (read: just for fun)
cross-platform C++ project. We've already identified all the cross-platform libraries we'll need. However, some of our developers prefer to use Microsoft Visual C++ 2008, others prefer to code in Emacs on GNU/Linux. We're wondering if it is possible for all of us to work more or less simultaneously out of both environments, from the same code repository. Ultimately we want the project to compile cleanly on both platforms from the start.
All of our developers are happily willing to switch over to the other environment if this is not possible. We all use both Linux and Windows on a regular basis and enjoy both, so this isn't a question of trying to educate one set of devs about the virtues of the other platform. This is about each of us being able to develop in the environment we enjoy most yet still collaborate on a fun project.
Any suggestions or experiences to share?
Use CMake to manage your build files.
This will let you set up a single repository with one set of text files in it. Each dev can then run the appropriate CMake scripts to generate the correct build environment for their system (Visual Studio 2008/2005 projects, GNU C++ makefiles, etc.); a minimal example follows the list below.
There are many advantages here:
Each dev can use their own build environment
Dependencies can be handled very cleanly, including platform specific deps.
Builds can be out of source, which helps prevent accidentally committing inappropriate files
Easy migration to new dev. environments (ie: when VS 2010 is released, some devs can migrate there just by rebuilding their build folder)
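For reference, the CMake input for a small project is a single CMakeLists.txt along these lines (a sketch; the project name and file paths are placeholders):

# CMakeLists.txt
cmake_minimum_required(VERSION 2.8)
project(fungame CXX)

# pick the platform-specific sources
if(WIN32)
    set(PLATFORM_SRC src/win/platform.cc)
else()
    set(PLATFORM_SRC src/linux/platform.cc)
endif()

add_executable(fungame
    src/main.cc
    src/engine.cc
    ${PLATFORM_SRC})

A Windows dev then runs cmake -G "Visual Studio 9 2008" in an empty build folder to get a .sln, while a Linux dev runs cmake followed by make; neither set of generated files ever needs to be committed.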
I've done it and seen it done without too many problems.
You'll want to try to isolate the code that differs between the platforms. Also, you'll want to think about your directory structure. Something like:
project/src <- .cc and .h files
project/src/linux|win <- code that is specific to one platform or the other
project/linux <- make files and other project related stuff
project/win <- .sln and .vcproj files
Basically you just want to be really clear what is specific to each system and what is common.
Also, unit tests are going to be really important, since there may be minor differences, and you want to make it easy for the Windows guys to run some tests to make sure the Linux code works as expected, and the other way around.
As mentioned in previous posts, Qt is a very easy way to do real simultaneous multi-platform development - independent of your IDE and with many different compilers (even for Symbian, ARM, Windows CE/Mobile...).
In my last and current company I have worked in mixed Linux and Windows developer teams, who work together using Subversion and Qt (whose very easy and powerful build system, qmake, hides all the different platform/compiler-specific build environments - you just have to write one build file for all platforms - so easy!).
And: Qt contains nearly everything you need:
simple string handling, with easy translation support and easy string conversion (UTF-8, UTF-16, ASCII...)
simple classes for file i/o, images, networking etc.
comfortable gui classes
graphical designer for the gui layout/design
graphical translation tool to create dynamic translations for your apps (you can run one binary with different selectable languages - even Cyrillic and Asian fonts...)
integrated testing framework (unit tests)
So Qt is a full-featured and very reliable environment - I have used it for 10 years.
And: Qt integrates seamlessly into IDEs like VC++ and Eclipse, or you can use its own IDE, QtCreator.
see: http://www.trolltech.com + http://doc.trolltech.com
Best Regards,
Chris
I had such experience working at Acronis. We had the same codebase used to build binaries (fully packaged installers, actually) targeting Win32/64, Linux, and OS X (and a bunch of other more exotic platforms, such as EFI), with everyone working in their IDE of choice. The trick here is to avoid compiler- and IDE-specific solutions, and make your project fully buildable from clean cross-platform make files. Note that you can perfectly well use any make system together with VC++, so it isn't a problem (we used Watcom make for historical reasons, but I wouldn't recommend it).
One other trick you can do is add a make script that automatically produces project files from the input lists in your makefiles, for all IDEs you use (e.g. VS and Eclipse CDT). That way, every developer generates that stuff for himself, and then opens those projects in IDE to edit/build/debug, but the source repository only has makefiles in it.
Ensuring that code is compilable for everyone can be a problem, mostly because VC++ is generally more lax in applying the rules than g++ (for example, it will let you bind an rvalue to a non-const reference, albeit with a warning). If you compile with "treat warnings as errors" and the highest warning levels (with perhaps a few hand-picked warnings disabled), you will mostly avoid this. Having a continuous rolling build set up is another way to catch those early. One other approach we used at Acronis was to have the Windows build environment include Cygwin-based cross-compilation tools, so any Windows dev could do a build targeting Linux (with the same g++ version and all) from his box, and see if that fails, to verify that his changes will compile in g++.
I personally use cmake/mingw32/Qt4 for all my C++ needs.
QtCreator is a crossplatform IDE which is somewhat optimized for qt4 programming.
We're working on a cross-platform project as well. We're using Emacs to code, SCons to build and Visual Studio 2008 to debug. It's my first time using Emacs + SCons and I must say that it's very nifty once you figure out how the SConstruct and SConscript files work.
I'm late to this question, and there are a lot of good answers here, but I haven't seen anyone enumerate all the issues I've run into (most of my work in C++ is and has been cross-platform), so:
Cross-platform compilation. Everyone has covered this quite well: CMake and SCons are useful, among others.
Platform differences. These aren't actually limited to Linux vs. Windows. Different versions of Windows have plenty of subtle issues. If you ever want to jump from Linux to OS X you'll find the same. So do processor architecture differences: 32 vs. 64 bit doesn't usually matter - until it does, and things break on you very badly (a tiny demonstration follows this list). This is something C++ programmers face more than programmers in most other languages, whose implementations work hard to hide this sort of thing.
Test everything. Alan mentions unit tests but let me emphasize them. The real problem with cross-platform programming is not code that won't compile: it's code that compiles just fine and does the wrong thing in subtle cases. Unit tests matter here much more than they do when you are working on a single platform.
Use portable libraries whenever possible. Getting this right is full of subtle gotchas. It's better to take advantage of someone else's work than to roll your own. Qt is useful here, as are NSPR and APR (the latter two are C, but wrappers exist if you don't want to mess with them directly.) These are the libraries that explicitly target platform abstraction as a goal, but most other libraries you use should be checked for this. Be reasonably wary of cool libraries that don't have a track record of portability. I'm not saying don't use them: just test first. Doing this right saves you enormous effort on testing.
Pavel mentions continuous integration. This may not matter if there are only a handful of you. But I've found that the number of platforms you get when you consider all of the OS, OS variant, and processor differences means that without some form of continuous build-test cycle you will always be missing some edge case. If your project gets bigger than a handful of people, consider doing this.
Version control. The most obvious issue is handling newlines and filename case-sensitivity but there are others. Most of the well-known open source VCSes do this right by now, but it's notable that it usually takes them several years to find all their bugs related to portability. As an example Mercurial, which has had portability as one of its selling points, has a series of fixes spanning three or four releases dealing with Windows filenames in uncommon situations. Make sure you can check out what the others check in early on.
Scripts. Use Perl/Python/Ruby (or something like them) for your scripting. Trying to keep Linux shell and Windows batch scripts in sync is painfully tedious.
Despite all of the above: overall, cross-platform is quite doable, and there's no real reason to avoid it. My experience is that the problems tend to crop up sporadically, but when they do they take more than their fair share of time. If you've been programming a while, though, you won't find that unusual.
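To illustrate the 32- vs. 64-bit point under "Platform differences" above with the smallest possible example: Win64 uses the LLP64 data model while 64-bit Linux uses LP64, so the same type can change size between platforms.

#include <iostream>

int main() {
    // On Win64 (LLP64) long is 4 bytes; on 64-bit Linux (LP64) it is 8.
    // Code that quietly assumes sizeof(long) == sizeof(void*) breaks on Win64.
    std::cout << "long:  " << sizeof(long)  << " bytes\n"
              << "void*: " << sizeof(void*) << " bytes\n";
    return 0;
}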
The issue is not really editing the C++ source files, but the build process itself. VS doesn't use the same Makefile architecture as Linux development typically does. A radical suggestion, but both groups could actually use the same C++ IDE and build process - see Code::Blocks for more info.
This is possible (I do that to earn my daily money :-) ).
But you have to keep in mind the differences in the OS-supplied libraries, which can be very annoying.
Two good options to get around that are Boost and Qt.
Both supply you with platform-independent functions for useful stuff that isn't handled by the C library and the STL.
I've done this in two ways:
Each platform has its own build system. Whenever someone changes something (adding a source file), they either adapt the other platform too, or they mail a notification and someone more familiar with the other platform adapts it accordingly.
Use CMake.
I can't say I really liked CMake, but a cross-platform tool certainly has its advantages.
Generally, using different compilers to compile your code from the very first few lines is always a good idea, as it helps improve code quality.
Another vote for cmake.
One thing to watch out for: filenames are case-preserving but not case-sensitive on Windows. They are case-sensitive on most Unix filesystems.
This is generally a problem with #include
e.g. given a file FooBar.h:
#include "foobar.h" // works on Windows, fails on Linux.
I do agree with everyone here that has suggested cmake for cross platform development in C++. It is an excellent tool.
Another thing I would suggest is to use Eclipse CDT as the development environment. It works anywhere you can run Java and gcc, and that unifies your development environment.
I think it was Alan Jackson who emphasized unit tests. For that you would need some cross-platform unit test libraries. I read this post a while ago regarding C++ unit test frameworks. It is a bit outdated but thoughtful. The one missing that also works on both platforms is googletest.
Finally, if you want to run those tests automatically on both platforms, CMake has another tool called CTest that is very good for doing so.
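A googletest case is only a few lines. A sketch (path_join is a made-up example of the kind of platform-sensitive helper worth testing on both systems):

#include <gtest/gtest.h>
#include <string>

// hypothetical helper whose behaviour might differ across platforms
std::string path_join(const std::string& a, const std::string& b) {
    return a + "/" + b;
}

TEST(PathJoinTest, UsesForwardSlash) {
    EXPECT_EQ("src/main.cc", path_join("src", "main.cc"));
}

int main(int argc, char** argv) {
    ::testing::InitGoogleTest(&argc, argv);
    return RUN_ALL_TESTS();
}

Register the resulting binary with add_test() in your CMakeLists.txt and CTest will run it on every platform.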
Check out Premake... It is pretty similar to CMake but written in Lua.
I have used this on a number of development projects and find it easy to learn and integrate into existing company and project structures.
Give it a try!
http://industriousone.com/premake