This issue was moved to a discussion.

You can continue the conversation there.


Ghidra crashes when analyzing big Mach-O binaries on Debian (100 MB+ files?) #6536

Closed
slayy2357 opened this issue May 18, 2024 · 3 comments


slayy2357 commented May 18, 2024

Describe the bug
When I run an analysis of a Mach-O binary larger than 100 MB on Debian (same result on Windows), Ghidra crashes around the middle of the analysis. I tested with different Mach-O binaries, and I think their size is the cause, since "small" binaries work fine. See the log file:
logs.txt

Line 776 of the log file: "./launch.sh: line 217: 15912 Killed"

To Reproduce
Steps to reproduce the behavior:

  1. Add mach-O binary to project
  2. Analyze it
  3. Ghidra crashes (possibly only in my environment)

Expected behavior
Complete analysis of the binary without a crash

Environment (please complete the following information):

  • OS: [Debian & Windows]
  • Java Version: [17.0.10]
  • Ghidra Version: [11.0.4]
  • Ghidra Origin: [Official GitHub distro]
@seekbytes

From the logs it seems the process is being killed. Are you sure you haven't hit the memory limit, causing the OOM killer to terminate Ghidra?
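One way to confirm this on Debian is to grep the kernel log for OOM-killer entries after a crash. A hedged sketch: the sample line below is illustrative (using the PID 15912 from the log excerpt), not copied from the attached logs.txt; on the real machine you would pipe `dmesg` instead.

```shell
# On the affected machine, search the kernel ring buffer after a crash:
#   dmesg -T | grep -iE 'out of memory|oom-killer|killed process'
# Illustrative sample of the kind of line the OOM killer emits:
sample='Out of memory: Killed process 15912 (java) total-vm:9871234kB'
# The same grep pattern you would run against the real dmesg output:
echo "$sample" | grep -iE 'out of memory|oom-killer|killed process'
```

If that grep matches a `java` process around the time of the crash, the kernel's OOM killer terminated the Ghidra JVM.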

@slayy2357
Author

OK, how can I calculate the memory needed to analyze a 100 MB Mach-O binary?

@dev747368
Collaborator

OK, how can I calculate the memory needed to analyze a 100 MB Mach-O binary?

It will depend on the specific analyzers that are involved in processing your binary.

You can open a memory usage window in Ghidra from the project window's Help menu. In recent Ghidra versions it is under Help | Runtime Info | Memory tab. The display refreshes every few seconds, so leave it up in the background; if the used memory reaches the maximum memory value and you then get a crash, it's probably a memory problem.

You can also disable individual analyzers before kicking off the analysis session. I'm not familiar with all analyzers, but, for example, the DWARF analyzer can need quite a bit of memory on extremely large binaries.

The name of the currently running analyzer will typically show in the bottom-right corner of the tool window, next to the progress bar (though sometimes it is not helpful).
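If memory does turn out to be the limit, the usual remedy is to raise the JVM's maximum heap before launching Ghidra. A minimal sketch, assuming the stock launcher scripts from the official distribution; the exact variable and its location are documented in comments inside your installation's `ghidraRun` script, so check there before editing:

```shell
# In the Ghidra installation directory, the ghidraRun launcher script
# contains a MAXMEM setting that is passed through to the JVM heap limit.
# Uncomment/edit it to something comfortably above what analysis needs, e.g.:
MAXMEM=8G
# then launch as usual:
./ghidraRun
```

The value of 8G here is an arbitrary example; pick it based on the physical RAM available, since setting it above what the machine can provide just moves the OOM kill later in the analysis.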

@NationalSecurityAgency NationalSecurityAgency locked and limited conversation to collaborators May 21, 2024
@ryanmkurtz ryanmkurtz converted this issue into discussion #6550 May 21, 2024
