Pushing model to Hugging Face


# Assuming the previous cell with google.colab.files.upload() has run successfully
# and the 'uploaded' dictionary contains your model file data.

import os

# Get the filename from the 'uploaded' dictionary (assuming only one file was uploaded).
if uploaded:
    uploaded_filename = list(uploaded.keys())[0]
    print(f"Uploaded file: {uploaded_filename}")
else:
    # This should not happen if the previous cell completed without error.
    raise FileNotFoundError("No file was uploaded. Please upload your trained YOLO model (.pt).")

# Construct the full path to the uploaded file; Colab uploads files to /content/ by default.
uploaded_filepath = f"/content/{uploaded_filename}"

# Verify that the uploaded file exists at the expected path and is not just a placeholder.
# You might also want to check the file size or content type if possible.
if not os.path.exists(uploaded_filepath):
    # If the file isn't at the expected path, list files in /content/ to debug.
    print(f"Error: Uploaded file not found at {uploaded_filepath}. Listing files in /content/:")
    !ls -lh /content/
    raise FileNotFoundError(f"Uploaded file not found at {uploaded_filepath}. Please check the upload process.")
else:
    # Optional: print the file size to confirm it's not just a small placeholder.
    print(f"Found uploaded file at: {uploaded_filepath} (Size: {os.path.getsize(uploaded_filepath)} bytes)")
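The existence check only confirms a file is present; it will happily accept a placeholder. A stricter sanity check can reject bad uploads before YOLO ever tries to unpickle them, since PyTorch checkpoints saved with torch.save() (PyTorch >= 1.6) are zip archives, and a real trained checkpoint is megabytes, not hundreds of bytes. A minimal sketch (the helper name and the 100 KB threshold are illustrative, not part of the original notebook):

```python
import os
import zipfile

def looks_like_torch_checkpoint(path, min_bytes=100_000):
    """Heuristic check: modern PyTorch .pt files are zip archives, and a
    trained YOLO checkpoint is several megabytes, not a few hundred bytes."""
    if not os.path.exists(path):
        return False
    if os.path.getsize(path) < min_bytes:
        return False  # almost certainly a placeholder or a failed upload
    # torch.save() (PyTorch >= 1.6) writes a zip container
    return zipfile.is_zipfile(path)

# Demo on a deliberately tiny fake file; in Colab, pass uploaded_filepath instead.
with open("/tmp/fake_model.pt", "wb") as f:
    f.write(b"not a real checkpoint")
print(looks_like_torch_checkpoint("/tmp/fake_model.pt"))  # False
```

Very old checkpoints saved in the legacy (non-zip) format would fail this check, so treat it as a heuristic rather than a hard gate.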


# Now load the YOLO model using the correct filepath.
# Make sure ultralytics is installed and YOLO is imported correctly; e.g. if you
# cloned to /content/ultralytics, run %cd /content/ultralytics first.
# If you are running this cell from /content/ultralytics and want to load a model
# from /content/, use the full path.
# This loads a YOLOv8 model (as per the original code attempt).

from ultralytics import YOLO
from huggingface_hub import HfApi, upload_file  # modules needed for the upload
import _pickle  # needed to catch the specific unpickling error

# Assuming MODEL_NAME is defined in a previous cell.
MODEL_NAME = "Yolov8n-onnx-export"

# Define Hugging Face repo details.
api = HfApi()
hf_username = api.whoami()["name"]
repo_id = f"{hf_username}/{MODEL_NAME}"
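One gap worth noting before the upload step: upload_file() assumes the target repo already exists on the Hub, and huggingface_hub provides create_repo(..., exist_ok=True) to guarantee that. A small sketch (the helper name and the create flag are illustrative; with create=False it only computes the target path, so it runs without a logged-in session):

```python
def ensure_repo_and_target(repo_id, model_name, create=False):
    """Return the path the ONNX file will have inside the repo, optionally
    creating the repo first. upload_file() fails if the repo does not exist,
    so create_repo(..., exist_ok=True) is a safe preliminary step."""
    path_in_repo = f"{model_name}.onnx"
    if create:
        from huggingface_hub import HfApi  # requires an authenticated session
        HfApi().create_repo(repo_id=repo_id, repo_type="model", exist_ok=True)
    return path_in_repo

print(ensure_repo_and_target("user/Yolov8n-onnx-export", "Yolov8n-onnx-export"))
# Yolov8n-onnx-export.onnx
```

In the notebook you would call it with create=True once before upload_file(); exist_ok=True makes the call a no-op if the repo is already there.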

# Attempt to load and export the model.
print(f"Attempting to load model from: {uploaded_filepath}")

try:
    model = YOLO(uploaded_filepath)
    print("Model loaded successfully.")

    # Export the model to ONNX format.
    # model.export() returns the path to the exported file.
    results = model.export(format="onnx")
    print("Model exported to ONNX successfully.")
    print(f"Exported file: {results}")

    # Assuming the export was successful, 'results' contains the ONNX file path.
    onnx_filepath_after_export = results

    # Now use onnx_filepath_after_export for uploading to Hugging Face.
    print(f"Attempting to upload ONNX file: {onnx_filepath_after_export}")
    if os.path.exists(onnx_filepath_after_export):
        upload_file(
            path_or_fileobj=onnx_filepath_after_export,
            path_in_repo=f"{MODEL_NAME}.onnx",  # upload with the model name as the filename in the repo
            repo_id=repo_id,
            repo_type="model",
        )
        print(f"Successfully uploaded {onnx_filepath_after_export} to {repo_id}/{MODEL_NAME}.onnx")
    else:
        # This branch should not be reached if the export succeeded.
        print(f"Error: ONNX file not found after export at {onnx_filepath_after_export}.")
        print("Please check the export process.")

# Catch the specific unpickling error.
except _pickle.UnpicklingError as e:
    print(f"Error loading model: {e}")
    print("-" * 50)
    print("This error usually means the uploaded model file is corrupted or not a valid PyTorch model (.pt).")
    print("Please ensure you uploaded the correct, uncorrupted .pt file.")
    print("You may need to re-download the model from its source or re-upload it.")
    print("-" * 50)
    import traceback  # print detailed error info
    traceback.print_exc()

# Catch any other exceptions during loading or exporting.
except Exception as e:
    print(f"An unexpected error occurred during model loading or exporting: {e}")
    import traceback
    traceback.print_exc()


Continue with the rest of your notebook if any cells follow

Output:

Uploaded file: Yolov8n train.pt
Found uploaded file at: /content/Yolov8n train.pt (Size: 844 bytes)
Attempting to load model from: /content/Yolov8n train.pt
Error loading model: unpickling stack underflow
--------------------------------------------------
This error usually means the uploaded model file is corrupted or not a valid PyTorch model (.pt).
Please ensure you uploaded the correct, uncorrupted .pt file.
You may need to re-download the model from its source or re-upload it.
--------------------------------------------------

Traceback (most recent call last):
  File "", line 55, in <cell line: 0>
    model = YOLO(uploaded_filepath)
            ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/models/yolo/model.py", line 79, in __init__
    super().__init__(model=model, task=task, verbose=verbose)
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/engine/model.py", line 151, in __init__
    self._load(model, task=task)
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/engine/model.py", line 295, in _load
    self.model, self.ckpt = attempt_load_one_weight(weights)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/nn/tasks.py", line 1540, in attempt_load_one_weight
    ckpt, weight = torch_safe_load(weight)  # load ckpt
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/nn/tasks.py", line 1444, in torch_safe_load
    ckpt = torch.load(file, map_location="cpu")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/ultralytics/utils/patches.py", line 117, in torch_load
    return _torch_load(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torch/serialization.py", line 1495, in load
    return _legacy_load(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torch/serialization.py", line 1744, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_pickle.UnpicklingError: unpickling stack underflow
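The 844-byte size in the output above is the real clue: a trained YOLOv8n checkpoint is on the order of megabytes, so the upload almost certainly produced a placeholder or an error page rather than the model. The "unpickling stack underflow" message itself comes from Python's pickle module choking on bytes that are not a pickle stream; the same error reproduces with plain pickle on a malformed input:

```python
import pickle

# torch.load() on a non-checkpoint file eventually hands the raw bytes to the
# pickle module, which is where "unpickling stack underflow" comes from.
try:
    pickle.loads(b".")  # a lone STOP opcode with an empty stack
except pickle.UnpicklingError as e:
    print(e)  # unpickling stack underflow
```

So the fix is not in this cell at all: re-export or re-download the .pt file and upload it again, then confirm the size looks plausible before loading.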

It seems you just posted a notebook so could you describe what your question is?