DEEP LEARNING COMPUTER VISION INTERN
STATION F · France
Internship · Remote Friendly · Engineering, Information Technology
About

At Neurobus, we develop intelligent vision systems powered by neuromorphic technologies, enabling ultra-energy-efficient embedded intelligence that lets autonomous systems perceive, navigate, and respond in real time in drone, defense, and space applications.

Job Description

Neurobus develops cutting-edge vision systems that leverage neuromorphic technologies to enhance the intelligence and efficiency of embedded devices and robots in the Space and Defense sectors.

We are opening an internship focused on deep learning for computer vision, with an emphasis on neuromorphic-inspired model architectures designed for efficient edge deployment. The core objective is to implement and evaluate a next-generation transformer-style vision model that operates on discrete, relative spike-timing representations and scales efficiently to higher-resolution inputs.
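
To make the term concrete, the sketch below shows one common way pixel intensities can be turned into discrete, relative spike-timing codes (a time-to-first-spike style encoding). It is purely illustrative: the representation actually used at Neurobus is not described in this posting, and the function name and step count are placeholders.

    # Illustrative only: one common way to map pixel intensities to discrete,
    # relative spike times (a time-to-first-spike style code). Function name
    # and step count are placeholders, not the representation used at Neurobus.
    import torch

    def to_relative_spike_times(images: torch.Tensor, num_steps: int = 16) -> torch.Tensor:
        """Map intensities in [0, 1] to integer spike times in [0, num_steps - 1].

        Brighter pixels fire earlier; only the relative timing (order/offset
        within the encoding window) carries information.
        """
        images = images.clamp(0.0, 1.0)
        # Invert so that high intensities correspond to early (small) times.
        return ((1.0 - images) * (num_steps - 1)).round().long()

    if __name__ == "__main__":
        x = torch.rand(2, 1, 28, 28)          # e.g., a grayscale mini-batch
        spikes = to_relative_spike_times(x)   # discrete timing codes, same shape
        print(spikes.shape, spikes.min().item(), spikes.max().item())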

You will contribute to building a research-grade prototype in PyTorch, validating it on standard vision benchmarks, and iterating toward architectures that offer favorable trade-offs between accuracy, latency, memory bandwidth, and compute. Because parts of the approach are not widely supported in existing deep learning libraries, this internship involves implementing key components from scratch, including custom forward/backward passes when necessary.
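
As an illustration of what implementing a custom forward/backward pass can look like in PyTorch, the sketch below defines a hard-threshold spike nonlinearity with a surrogate gradient via torch.autograd.Function. This is a generic pattern from the event-driven modelling literature, not Neurobus's actual component; the class name, threshold, and surrogate shape are illustrative choices.

    # A minimal sketch of a custom forward/backward pass in PyTorch: a hard
    # threshold ("spike") in the forward pass with a surrogate gradient in the
    # backward pass. Generic pattern only; names and constants are illustrative.
    import torch

    class SpikeWithSurrogateGrad(torch.autograd.Function):
        @staticmethod
        def forward(ctx, membrane: torch.Tensor, threshold: float):
            ctx.save_for_backward(membrane)
            ctx.threshold = threshold
            # Non-differentiable step: emit 1 where the input crosses the threshold.
            return (membrane >= threshold).to(membrane.dtype)

        @staticmethod
        def backward(ctx, grad_output):
            (membrane,) = ctx.saved_tensors
            # Surrogate gradient: derivative of a fast sigmoid centred at the threshold.
            surrogate = 1.0 / (1.0 + 10.0 * (membrane - ctx.threshold).abs()) ** 2
            return grad_output * surrogate, None  # no gradient w.r.t. the threshold

    if __name__ == "__main__":
        v = torch.randn(4, 8, requires_grad=True)
        spikes = SpikeWithSurrogateGrad.apply(v, 1.0)
        spikes.sum().backward()
        print(v.grad.shape)  # gradients flow through the surrogate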

As a Deep Learning Computer Vision Intern at Neurobus, you will:

  • Implement core building blocks of an efficient, transformer-style vision model in PyTorch, including components that rely on discrete/event-like computations (a generic sketch of such a block appears after this list).
  • Establish training and evaluation baselines on standard computer vision tasks (e.g., image classification, object detection), demonstrating stable learning and reproducible results.
  • Extend the architecture with multi-scale or hierarchical processing to improve efficiency and scalability for larger images and higher token counts.
  • Benchmark performance against strong modern baselines, with attention to both model quality and efficiency metrics (runtime, memory, throughput).
  • Investigate positional and spatial representation strategies suited to discrete/event-like processing and assess their effect on training stability and accuracy.
  • Perform systematic ablation studies across key architectural and hyperparameter choices (e.g., depth, width, attention configuration, comparison/lookup mechanisms), and quantify impacts on compute, memory, and accuracy.
  • Explore regularization and robustness techniques tailored to discrete and lookup-based model components, and evaluate their benefits across tasks.
  • Document implementations, experiments, and results, and present progress updates to the team.
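
For orientation, the sketch referenced in the first bullet shows a plain transformer-style vision building block in PyTorch: patch embedding followed by a pre-norm attention + MLP block. It is a standard baseline-style block, not the discrete, event-like variant described above; in the internship, components such as the comparison/lookup mechanisms, positional handling, and multi-scale processing would replace or extend parts of it.

    # Generic baseline-style block, for orientation only: patch embedding plus
    # a pre-norm attention + MLP block in plain PyTorch. The event-driven
    # variant described above would replace or wrap parts of this.
    import torch
    import torch.nn as nn

    class PatchEmbed(nn.Module):
        def __init__(self, in_ch=3, dim=192, patch=16):
            super().__init__()
            self.proj = nn.Conv2d(in_ch, dim, kernel_size=patch, stride=patch)

        def forward(self, x):                      # (B, C, H, W)
            x = self.proj(x)                       # (B, dim, H/p, W/p)
            return x.flatten(2).transpose(1, 2)    # (B, N, dim) token sequence

    class Block(nn.Module):
        def __init__(self, dim=192, heads=3, mlp_ratio=4.0):
            super().__init__()
            self.norm1 = nn.LayerNorm(dim)
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm2 = nn.LayerNorm(dim)
            self.mlp = nn.Sequential(
                nn.Linear(dim, int(dim * mlp_ratio)), nn.GELU(),
                nn.Linear(int(dim * mlp_ratio), dim),
            )

        def forward(self, x):
            h = self.norm1(x)
            x = x + self.attn(h, h, h, need_weights=False)[0]  # residual attention
            return x + self.mlp(self.norm2(x))                 # residual MLP

    if __name__ == "__main__":
        tokens = PatchEmbed()(torch.randn(2, 3, 224, 224))  # (2, 196, 192)
        print(Block()(tokens).shape)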

Preferred Experience

  • Currently pursuing or having recently completed a Master’s degree in Machine Learning, Computer Vision, Computer Science, Applied Mathematics, or a related field
  • Strong theoretical foundation in machine learning and the mathematical tools used in deep learning
  • Solid hands-on experience with PyTorch, including implementing custom model blocks
  • Ability and willingness to implement novel research code from scratch, including custom ops and gradients where needed
  • Familiarity with transformer architectures and modern computer vision models
  • Strong problem-solving skills, scientific rigor, and comfort with experimental iteration
  • Excellent communication and collaboration skills
  • Fluency in English

Bonus Points

  • Experience with neuromorphic computing, event-based sensing, or spiking/event-driven models
  • Experience implementing custom PyTorch autograd Functions and/or C++/CUDA extensions
  • Familiarity with hierarchical vision architectures and multi-scale feature learning
  • Experience with object detection and evaluation workflows
  • Strong software engineering practices (clean abstractions, testing, reproducibility)

Additional Information

  • Contract Type: Internship (5 to 6 months)
  • Location: Paris
  • Occasional remote work permitted
