
Segmentation fault when running GEMMA-2B model #5824

Open
mvhsin opened this issue Jan 15, 2025 · 2 comments
Assignees
kuaashish

Labels
- platform:android (Issues with Android as Platform)
- platform:c++ (Issues specific to C++ framework in mediapipe)
- stale
- stat:awaiting response (Waiting for user response)
- task:LLM inference (Issues related to MediaPipe LLM Inference Gen AI setup)
- type:bug (Bug in the Source Code of MediaPipe Solution)

Comments

mvhsin commented Jan 15, 2025

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

None

OS Platform and Distribution

Android 12

Mobile device if the issue happens on mobile device

Orange Pi 5

Browser and version if the issue happens on browser

No response

Programming Language and version

C++

MediaPipe version

No response

Bazel version

6.5

Solution

llm_inference_engine_cpu_main

Android Studio, NDK, SDK versions (if issue is related to building in Android environment)

No response

Xcode & Tulsi version (if issue is related to building for iOS)

No response

Describe the actual behavior

I built llm_inference_engine_cpu_main and ran it with ./llm_inference_engine_cpu_main --model_path <path_to_gemma-2b-it-cpu-int4.bin>, and got a segmentation fault.

Describe the expected behaviour

llm_inference_engine_cpu_main should run and load the GEMMA .bin file without error.

Standalone code/steps you may have used to try to get what you need

The root cause is a mismatch between the condition and the function call here: https://github.com/google-ai-edge/mediapipe/blob/master/mediapipe/tasks/cc/genai/inference/c/llm_inference_engine_cpu.cc#L507 Please help fix it. Thanks!
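For readers without the source open: the class of bug being described is a file-type check whose branch invokes the loader for the other format, so loading fails and a later dereference crashes. The sketch below is a minimal, self-contained illustration of that pattern under assumed names, not MediaPipe's actual code; ModelData, LoadTaskBundle, LoadBinModel, and LoadModel are all hypothetical stand-ins, and the real condition and call are at the linked line 507.

```cpp
// Illustrative sketch of a condition/function-call mismatch.
// All names are hypothetical, not MediaPipe symbols.
#include <iostream>
#include <memory>
#include <string>

struct ModelData {
  std::string format;  // "task" or "bin"
};

static bool EndsWith(const std::string& s, const std::string& suffix) {
  return s.size() >= suffix.size() &&
         s.compare(s.size() - suffix.size(), suffix.size(), suffix) == 0;
}

// Each loader understands only its own format and returns nullptr otherwise.
std::unique_ptr<ModelData> LoadTaskBundle(const std::string& path) {
  if (!EndsWith(path, ".task")) return nullptr;
  return std::make_unique<ModelData>(ModelData{"task"});
}

std::unique_ptr<ModelData> LoadBinModel(const std::string& path) {
  if (!EndsWith(path, ".bin")) return nullptr;
  return std::make_unique<ModelData>(ModelData{"bin"});
}

std::unique_ptr<ModelData> LoadModel(const std::string& path) {
  if (EndsWith(path, ".bin")) {
    // Mismatch: the condition selects the .bin case, but the branch calls
    // the .task loader, which rejects the file and returns nullptr.
    return LoadTaskBundle(path);  // should be LoadBinModel(path)
  }
  return LoadTaskBundle(path);
}

int main() {
  auto model = LoadModel("gemma-2b-it-cpu-int4.bin");
  // A caller that assumes loading succeeded dereferences the null result,
  // which typically crashes with SIGSEGV, matching the reported symptom.
  std::cout << model->format << "\n";
  return 0;
}
```

If the real bug matches this shape, the fix is small: make the called loader agree with the format the condition tests for, and guard the result with a null check before the first dereference.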

Other info / Complete Logs

No response

mvhsin added the type:bug label on Jan 15, 2025
kuaashish self-assigned this and unassigned kalyan2789g on Jan 15, 2025
kuaashish added the platform:android, platform:c++, and task:LLM inference labels on Jan 15, 2025
kuaashish (Collaborator) commented:
Hi @mvhsin,

Could you please provide the complete steps you are following from our documentation, or share the command you are using along with the full error log? This will help us better understand the issue and reproduce it on our end.

Thank you!!

kuaashish added the stat:awaiting response label on Jan 16, 2025
github-actions bot commented:

This issue has been marked stale because it has had no recent activity in the last 7 days. It will be closed if no further activity occurs. Thank you.

github-actions bot added the stale label on Jan 24, 2025