Motion Analysis
For Every Athlete.

No labs. No sensors. No wait.
Designed to be Fast, Accurate, and Accessible.

From Video
to Kinematics.

Five steps. One smartphone. Complete motion analysis.

Set Up the Shot
01

Position your smartphone on a tripod or hand it to a coach or teammate. No special setup required — just your phone to capture the motion.

Record the Run
02

Film the athlete in motion in landscape mode. Kinepose captures high-resolution footage frame by frame, ready for biomechanical analysis.

Upload the Video
03

Open the Kinepose app and upload a video up to 10 seconds long. The app automatically processes the footage in under a minute.

Analyse the Motion
04

Our AI pipeline tracks keypoints, computes joint angles, detects gait events, and overlays a real-time motion skeleton on the video.

Read the Insights
05

Get clear, actionable biomechanical indicators — angles, ground contact time, air time, and more — without any lab equipment.

Built on Real
Sports Science.

107° 165° 82°
185 spm Cadence
94% Symmetry

Computer Vision at the Core.

Our AI pipeline detects and tracks 17+ anatomical keypoints frame-by-frame, computing precise joint angles and biomechanical vectors in real time — directly from standard smartphone video, no markers or sensors needed.
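The joint-angle computation described above can be sketched as follows, assuming each keypoint is a 2D (x, y) pixel coordinate; the function name `joint_angle` is illustrative, not Kinepose's actual API:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b in degrees, given three 2D keypoints a-b-c
    (e.g. hip-knee-ankle for the knee angle)."""
    a, b, c = np.asarray(a, float), np.asarray(b, float), np.asarray(c, float)
    v1, v2 = a - b, c - b  # vectors from the joint to its neighbours
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip guards against floating-point drift just outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# A fully extended leg (hip, knee, ankle collinear) reads as 180 degrees
print(joint_angle((0, 0), (0, 1), (0, 2)))  # 180.0
```

The same three-point formula applies to every hinge joint in the skeleton, which is why a single keypoint set yields elbow, knee, and hip angles at once.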

17+ Tracked keypoints

Frame-Perfect Gait Analysis.

Kinepose automatically detects critical gait events — foot strike, toe-off, mid-stance, and flight phase — enabling measurement of cadence, ground contact time, stride length, and asymmetry indices with lab-level precision.
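As a toy illustration of how such gait events can fall out of tracked keypoints (a simple height-threshold sketch, not Kinepose's actual pipeline, which the text does not detail): given the vertical trajectory of a foot keypoint, contact frames, foot strikes, toe-offs, and ground contact time can be derived as:

```python
def gait_events(foot_y, fps, ground_eps=0.05):
    """Toy gait-event detector: frames where the foot keypoint sits within
    ground_eps of its lowest height count as ground contact. A foot strike
    is the onset of contact; a toe-off is its end."""
    ground = min(foot_y)
    contact = [y <= ground + ground_eps for y in foot_y]
    strikes = [i for i in range(1, len(contact)) if contact[i] and not contact[i - 1]]
    toeoffs = [i for i in range(1, len(contact)) if not contact[i] and contact[i - 1]]
    contact_time = sum(contact) / fps  # total ground contact in seconds
    return strikes, toeoffs, contact_time

# Synthetic trajectory: airborne (1) vs. on the ground (0), sampled at 10 fps
strikes, toeoffs, ct = gait_events([1, 1, 0, 0, 0, 1, 1, 0, 0, 1], fps=10)
```

With events in hand, cadence and stride length follow from the spacing between consecutive strikes, and left/right asymmetry from comparing the two feet.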

<33ms Per-frame analysis

See the Invisible.

Athletes and coaches get a real-time skeleton overlay on the recorded video, with colour-coded joint angle indicators. Review any frame, zoom in, and compare sessions side by side without any post-production.

60fps Smooth playback

Lab Quality for Every Athlete.

Traditional biomechanics labs cost thousands. Kinepose brings the same analytical power to any coach or athlete with a smartphone — making elite-level insight accessible at every level of the sport.

€0 Extra hardware

Motion in
Pure Data.

17 keypoints. Every stride. Real athlete data, rendered live.

AI Skeleton
Original Video

Video source: Noah Lyles 100m — YouTube

See the Engine
in Action.

Try our live demo to see the engine in action frame by frame. Every keypoint, every angle — computed live from our pose estimation pipeline.

Drag to analyse frame by frame

Joint Angles

Right Elbow —°
Left Elbow —°
Right Knee —°
Left Knee —°

This live demo is for demonstration purposes only. In-app overlay graphics may differ and will have smoother frame transitions.

Video source: Noah Lyles 100m — YouTube

Awarded for
Sports Innovation

1st Place — Samsung Innovation Campus 2025

Named the Most Innovative Sports Technology Project of 2025

Kinepose — born as "Track Analyzer" — took first place at Samsung Innovation Campus 2025 in the "Sports & Technology" category, where the University of Pisa recognised it as the most innovative project for combining AI with sports science, rethinking how athlete performance is analysed and trained.

Read the Article
🏆
Watch Demo

See KinePose in Action

Watch the full demo video showcasing the AI motion overlay, frame-by-frame analysis, and real-track biomechanics pipeline.

Watch on YouTube

Recognised by

Athletes.
Engineers.
Innovators.

Lorenzo Galli is a sports technology developer and the creator of Kinepose. With a background in computer science and a career in athletics, Lorenzo has always been fascinated by the intersection of sports and technology. Kinepose was born from his drive to fix a simple frustration: high-level biomechanical analysis was locked behind expensive lab equipment and specialist access.

Kinepose started life as "Track Analyzer", a project led by Lorenzo Galli and developed during Samsung Innovation Campus 2025 in cooperation with Fabio Gulmini, Martina Capobianco and Alessio Musto. Building on that foundation, Kinepose advances this vision through computer vision and AI, with a particular focus on the sport of athletics. The goal is to give coaches and athletes high-quality data without the need for a lab environment. As of today, Kinepose remains beta software, and Lorenzo Galli offers consultancy services for teams and athletes interested in using the technology for performance analysis and training optimisation.

1st Samsung Innovation
Campus 2025
AI Powered Motion
Pipeline
0 Extra hardware
needed

Mission

Make pro biomechanics accessible to every athlete

Technology

AI-powered pose estimation & gait event detection

Where

On the track, indoors — anywhere you train

Common
Questions.

Everything you need to know before getting started.

What video format does Kinepose support?

Any standard smartphone video format works — MP4, MOV, HEVC. We recommend filming in landscape mode at the highest available resolution and frame rate (60fps ideal) for best results.

What is the maximum video length?

Currently, the maximum video length is fixed at 5 seconds. This is enough to cover sprint starts, flying sprints, jumps and throws. The limit exists to keep processing times reasonable during the beta phase, but we are already working to at least double the maximum duration in a future update.

How long does the analysis take?

Since the pipeline runs across three different AI layers, processing times vary depending on the video framerate. A 5-second clip at 30fps typically takes around 30 seconds, while the same clip at 60fps takes around 1 minute — processing approximately 5 frames per second. The entire pipeline runs in the cloud, so your device stays free throughout.
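The timing figures above are consistent with a simple throughput model of roughly 5 processed frames per second; the helper below is an illustrative back-of-the-envelope estimate, not an official calculator:

```python
def estimated_processing_seconds(clip_seconds, fps, throughput_fps=5.0):
    """Estimate cloud processing time: total frames in the clip divided by
    the pipeline's approximate throughput (~5 frames per second)."""
    return clip_seconds * fps / throughput_fps

print(estimated_processing_seconds(5, 30))  # 30.0 seconds, matching the FAQ
print(estimated_processing_seconds(5, 60))  # 60.0 seconds, matching the FAQ
```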

How accurate is the biomechanical data?

Our model tracks 17 anatomical keypoints with sub-pixel accuracy on standard HD video, producing joint angle measurements within ±2–3° of traditional motion capture systems. Our 3-level AI pipeline is specifically designed to perform reliably even under non-optimal conditions — such as poor lighting, dark clothing, or a camera angle that is not perfectly orthogonal to the subject.

How do I get the best results?

While our pipeline handles difficult conditions well, we recommend positioning the camera orthogonally to the subject, ensuring good lighting, and making sure the athlete is the only person in the frame — no other athletes, coaches or bystanders. This gives the AI the cleanest possible input and ensures the most accurate analysis.

Which running disciplines are supported?

Kinepose is optimised for sprints (60m–400m), but works well for any linear running motion — middle distance, hurdles approach, jumps and throws. Extended support for additional disciplines is on our roadmap.

Do I need special equipment or markers?

No. All you need is a smartphone. Kinepose uses markerless pose estimation — no reflective dots, no suits, no lab hardware. Just point and shoot.

Is my athletes' data private?

Yes. Uploaded videos are processed in the cloud by our AI pipeline and automatically deleted as soon as the analysis is complete — meaning the developers have no access to your footage. Only the resulting biomechanical data is stored and associated with your account, fully encrypted in transit and at rest.

When will Kinepose be available publicly?

Current development is focused on continuously improving the professional software. While a future public app launch may be considered down the line, the service currently offered is on-field consultancy by Lorenzo Galli for teams and athletes interested in using Kinepose for performance analysis and training optimisation. Get in touch via the contact form below.

Ready to Analyse
Your Performance?

Whether you're a coach, athlete, sports federation, or researcher — we'd love to hear from you. Request a demo or reach out directly.

We'll get back to you within 48 hours.