The core dump files generated when my app crashes are systematically short by about 2692 bytes. When I load them in gdb I get:
BFD: Warning: [...]core is truncated: expected core file size >= 117628928, found: 117626236.
Sizes vary a bit, but the missing part (the difference between the expected size and the size found) is always between 2690 and 2693 bytes.
Before starting the app I force:
ulimit -c unlimited
There's plenty of space and time for the system to write the file.
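For anyone hitting this, it is worth verifying the limit and the destination filesystem from the same shell that launches the app (a sketch; the /proc paths are the standard Linux ones):

```shell
# confirm the core size limit actually took effect in this shell
ulimit -c unlimited
ulimit -c                          # should print "unlimited"

# see where the kernel writes core files, and check free space there
cat /proc/sys/kernel/core_pattern
df -h .

# flush the SD card's write cache before pulling the core off the target
sync
```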
Other details that may be relevant:
- The target is an APF27.
- It runs Linux kernel 2.6.38
- The core file is generated on a SD card.
- The size of the file matches the size found by gdb.
Any hint will be appreciated.
I'm using a Teensy 3.2 and cannot build my Teensy code because two warnings result in an error 1 return.
Warning 1 - .pio/build/teensy31/firmware.elf section `.text' will not fit in region `FLASH'
Warning 2 - region `FLASH' overflowed by 86948 bytes
Error - collect2: error: ld returned 1 exit status
From what I read this basically means the file is too large, but my src folder is 40129 bytes and the Teensy 3.2 flash size is 262144 bytes, as written in the platforms/teensy/boards/teensy31.json file.
The build output even begins with:
Verbose mode can be enabled via -v, --verbose option
CONFIGURATION: https://docs.platformio.org/page/boards/teensy/teensy31.html
PLATFORM: Teensy (4.16.0) > Teensy 3.1 / 3.2
HARDWARE: MK20DX256 72MHz, 64KB RAM, 256KB Flash
DEBUG: Current (jlink) External (jlink)
PACKAGES:
- framework-arduinoteensy # 1.156.0 (1.56)
- toolchain-gccarmnoneeabi # 1.50401.190816 (5.4.1)
The src folder is one cpp file (with the setup and loop functions) plus 4 header files with functions used in the cpp file. Also, the two warnings in the .h files are unrelated to the issue.
Tree for more clarity
From what I read it basically means that the file is too large but my
src folder is 40129 bytes and Teensy 3.2 flash size is 262144
The size of your src folder has little to do with the size of the generated program. If you are interested in where all that memory goes, you can use an ELF viewer.
For example, there is an online viewer here: http://www.sunshine2k.de/coding/javascript/onlineelfviewer/onlineelfviewer.html.
Upload your ELF file and scroll down to the symbol table section to find out what eats up that huge amount of memory.
I am working with OpenCV 2.4 and SVM classification, and I need to load a big dataset (about 400 MB of data) in C++. I've been able to save this dataset to an XML file, but I am unable to load it afterwards. Indeed, I receive the following message:
OpenCV Error: Insufficient memory (Failed to allocate 408909812 bytes) in OutOfMemoryError, file (my opencv2.4 directory)modules\core\src\alloc.cpp, line 52 - error: (-4)
How could I increase the available memory (I have plenty of free RAM)?
Thanks a lot!
EDIT :
Here is the place where the problem appears; the code works when I load a smaller file:
std::cout << "ok 0" << std::endl;
FileStorage XML_Data(Filename, FileStorage::READ);
XML_Data["Data"] >> m_Data_Matrix;
XML_Data.release();
std::cout << "ok 1" << std::endl;
EDIT 2 :
Problem solved: the solution was to compile my application and OpenCV 2.4.5 as a 64-bit application. I installed a 64-bit version of MinGW, built OpenCV with this new version (using CMake to configure), and then changed the compiler used by Code::Blocks.
You may find these links useful: http://forums.codeblocks.org/index.php?topic=13016.0 and http://www.drangon.org/mingw.
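A quick sanity check that the rebuilt binaries really are 64-bit (a sketch; the file names are hypothetical, and on Windows/MinGW `file` reports PE rather than ELF):

```shell
# "PE32+ executable ... x86-64" means a 64-bit MinGW build;
# plain "PE32" (no plus) means the binary is still 32-bit
file myapp.exe libopencv_core245.dll
```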
When analyzing a core file, my gdb 7.0 outputs several warnings:
warning: Wrong size gregset in core file.
warning: Wrong size fpregset in core file.
warning: Wrong size gregset in core file.
warning: Wrong size fpregset in core file.
warning: Unable to find dynamic linker breakpoint function.
GDB will be unable to debug shared library initializers
and track explicitly loaded dynamic code.
I am not sure if it's related, but I am unable to get a backtrace:
(gdb) bt
#0 0x00000000 in ?? ()
OS architecture is SUN Solaris 10 SPARC.
Questions:
What is the reason/cause of these warnings?
Why can't I retrieve a backtrace?
How to fix these problems?
The problem can be in gdb as well as in your program.
I would recommend updating gdb to the most recent version (7.3.1). It could also be helpful to create a simple test program and analyze its core with gdb, to be sure that the tool itself works fine.
The "gregset" and "fpregset" warnings indicate that gdb was unable to read data from the core file. This can happen if your program went wild and corrupted the stack. The gregset warning means gdb could not read the general-purpose register set from the core file; fpregset is the floating-point register set. The expected register size is platform dependent.
bt will not work if the core file cannot be read properly.
I also had the fpregset warnings (and no stack trace) when I tried to work on a 64-bit core dump with gdb 7.6.2 on Solaris 10. The cause seems to be that userspace applications on Solaris 10 are compiled as 32-bit by default - and a gdb built that way comes without support for 64-bit core dumps.
The guys in GDB's IRC channel gave me the following parameter:
--enable-64-bit-bfd
I also compiled a 64-bit version of gdb (-m64), but that shouldn't be necessary. Now gdb can work on the 64-bit core dump and creates the stack trace without any warnings.
I'm using the latest version of dbxtool (Solaris Studio) on RHEL 6.1.
I'm working through the tutorial example here using their example code, but when trying to run dbxtool on the core file generated, I get the following:
(dbx) cd /users/rory/Desktop/debug_tutorial
(dbx) debug /users/rory/Desktop/debug_tutorial/a.out core.a.out.10665
Reading a.out
dbx: warning: The corefile was truncated.
It should have been 1765376 bytes long (is only 483328)
Because of this, some functionality will be missing from dbx.
(See `help core')
core file header read successfully
Reading ld-linux-x86-64.so.2
Reading libstdc++.so.6
Reading libm.so.6
Reading libgcc_s.so.1
Reading libc.so.6
program terminated by signal SEGV (Segmentation fault)
dbx: core file read error: address 0x3faff579bc not available
dbx: attempt to fetch registers failed - stack corrupted
The first warning is about the core file being truncated (it should have been 1765376 bytes but is only 483328). However, I am able to generate other, larger core files in the same directory, so I'm not sure why this one is being truncated.
I've also gone through the tutorial here on removing core file size limits, but with no luck.
This is a known dbx problem on RHEL 6 (CR 7077948). The core file size is miscalculated if a data segment has a memory size (p_memsz) larger than the file size (p_filesz) in the ELF header. This problem has been identified and fixed in dbx 7.9.
I'm using an Olimex ARM-USB-OCD dongle with OpenOCD and GDB to program and debug an stm32f103 micro. The IDE I'm using came from the Olimex dev-kit CD and is based on Eclipse Ganymede.
I can load a small program into the RAM and step through the code without any problems.
I now have a much larger program which doesn't fit into RAM (which is only 20K) and so I'd like to run it from flash (which is 128K).
I've modified the linker script to indicate that the program code should go in the flash section (address 0x8000000), but gdb fails to load the program.
(gdb)
20 load main.out
&"load main.out\n"
load main.out
~"Loading section .text, size 0xb0e6 lma 0x8000000\n"
Loading section .text, size 0xb0e6 lma 0x8000000
&"Load failed\n"
Load failed
What should I do to get gdb to load the program into flash?
Have you considered flashing directly with openocd? I am doing this in a similar setup, but with an ARM7 microcontroller.
openocd -f flash.cfg
Here is my flash.cfg:
set CHIPNAME at91sam7x512
source [find interface/olimex-arm-usb-ocd.cfg]
source [find target/at91sam7sx.cfg]
init
halt
flash probe 0
flash probe 1
flash erase_sector 0 0 15
flash erase_sector 1 0 15
flash write_image my-image.elf
at91sam7 gpnvm 0 set
at91sam7 gpnvm 1 set
at91sam7 gpnvm 2 set
shutdown
The GPNVM stuff is Atmel SAM7-specific, but I think this script should give you a good starting point for an STM32 version. OpenOCD can be a bit confusing in the beginning, but the documentation is good and worth reading (http://openocd.berlios.de/). The current stable version (0.4.0) is quite old, so if you have problems, download the latest source code and compile your own.
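As a starting point, an untested STM32 adaptation might look like the following; every cfg file name and flash command here is an assumption to check against your OpenOCD version's documentation:

```shell
# write a hypothetical STM32F103 flash script (names unverified -- check your install)
cat > flash_stm32.cfg <<'EOF'
source [find interface/olimex-arm-usb-ocd.cfg]
source [find target/stm32f1x.cfg]
init
halt
# stm32 flash banks are probed automatically; "erase" wipes the affected sectors
flash write_image erase main.out
reset run
shutdown
EOF

openocd -f flash_stm32.cfg
```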