<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://upperbound.space/feed.xml" rel="self" type="application/atom+xml" /><link href="https://upperbound.space/" rel="alternate" type="text/html" /><updated>2026-04-26T18:03:27+00:00</updated><id>https://upperbound.space/feed.xml</id><title type="html">Upper Bound Robotics</title><subtitle>Autonomous systems engineering for aerospace, aerial, ground, and marine platforms — AI, flight control, GNC, and long-range communication solutions.</subtitle><entry><title type="html">Designing Embedded Systems for Space Environments</title><link href="https://upperbound.space/blog/2026/04/01/designing-embedded-systems-for-space/" rel="alternate" type="text/html" title="Designing Embedded Systems for Space Environments" /><published>2026-04-01T00:00:00+00:00</published><updated>2026-04-01T00:00:00+00:00</updated><id>https://upperbound.space/blog/2026/04/01/designing-embedded-systems-for-space</id><content type="html" xml:base="https://upperbound.space/blog/2026/04/01/designing-embedded-systems-for-space/"><![CDATA[<p>Designing electronics for space is a fundamentally different discipline than designing for terrestrial applications. On Earth, we take for granted stable temperatures, atmospheric pressure, magnetic field protection from radiation, and the ability to physically access hardware for repairs. In space, none of these exist.</p>

<p>At Upper, we leverage over a decade of ground-based robotics and embedded systems experience to design computing platforms that survive and operate reliably in the most demanding environment there is.</p>

<h2 id="the-space-environment">The Space Environment</h2>

<p>Before discussing design principles, it’s worth understanding what makes space so hostile to electronics:</p>

<p><strong>Radiation.</strong> Beyond Earth’s magnetosphere, electronics are bombarded by cosmic rays and solar particle events. A single heavy ion can flip a bit in memory (single-event upset), latch up a circuit (single-event latch-up), or permanently damage a transistor (single-event burnout). Over time, total ionizing dose degrades semiconductor performance and eventually causes failure.</p>

<p><strong>Thermal extremes.</strong> In low Earth orbit, a satellite transitions between direct sunlight (+120°C) and Earth’s shadow (-170°C) every 90 minutes. Components must survive thousands of these thermal cycles without cracking solder joints, delaminating PCBs, or failing from thermal fatigue.</p>

<p><strong>Vacuum.</strong> Standard electronics rely on convective cooling, where moving air carries heat away. In vacuum, the only thermal dissipation paths are conduction (through the structure) and radiation (emitting infrared). This fundamentally changes thermal management strategy.</p>

<p><strong>Vibration.</strong> Launch vehicles subject payloads to intense vibration and acoustic loads during ascent. Every solder joint, connector, and component mounting must survive this mechanical environment.</p>

<h2 id="design-principles">Design Principles</h2>

<p>Our approach to space-rated embedded system design follows several core principles:</p>

<h3 id="radiation-mitigation">Radiation Mitigation</h3>

<p>We employ multiple strategies depending on the mission profile and radiation environment:</p>

<ul>
  <li><strong>Component selection</strong> — using radiation-tolerant or radiation-hardened parts for critical functions, and commercial off-the-shelf (COTS) parts, backed by additional mitigation, where the risk allows</li>
  <li><strong>Error detection and correction</strong> — ECC memory, TMR (triple modular redundancy) for critical logic, and scrubbing routines for FPGA configurations</li>
  <li><strong>Shielding</strong> — strategic placement of mass around the most sensitive components, balanced against the weight budget</li>
  <li><strong>Software watchdogs</strong> — automated detection and recovery from radiation-induced anomalies</li>
</ul>
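<p>As a concrete illustration of the redundancy techniques above, the core of a TMR voter fits in a few lines. This is a simplified, hypothetical sketch (flight systems typically vote in hardware or rad-hard logic), but it shows the essential idea: keep three copies, take a bitwise majority, and flag any disagreement so a scrubbing routine can repair the corrupted copy.</p>

```python
def tmr_vote(a: int, b: int, c: int) -> tuple[int, bool]:
    """Bitwise majority vote across three redundant copies.

    Each output bit is 1 iff at least two of the three inputs have
    that bit set, so a single-event upset in any one copy is masked.
    Returns (voted_value, upset_detected) so a scrubbing routine can
    rewrite the disagreeing copy.
    """
    voted = (a & b) | (a & c) | (b & c)
    upset = not (a == b == c)
    return voted, upset

# A bit flip in one copy is masked by the other two:
value, upset = tmr_vote(0b1011, 0b1011, 0b1111)  # third copy corrupted
assert value == 0b1011 and upset
```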

<h3 id="thermal-management">Thermal Management</h3>

<p>Without convection, thermal design becomes a conduction and radiation problem:</p>

<ul>
  <li><strong>Thermal interface materials</strong> — ensuring efficient heat transfer from components to the spacecraft structure</li>
  <li><strong>Heat pipes and cold plates</strong> — passive thermal transport for high-power components</li>
  <li><strong>Heater circuits</strong> — keeping components above their minimum operating temperature during eclipse periods</li>
  <li><strong>Thermal modeling</strong> — FEA-based simulation of the full thermal environment before hardware is built</li>
</ul>
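<p>To make the conduction-and-radiation point concrete: a first-order radiator sizing comes straight from the Stefan-Boltzmann law. The numbers below are illustrative, not from a real design, and a flight analysis would add view factors, solar albedo, and Earth IR loading.</p>

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area(power_w, emissivity, t_surface_k, t_sink_k):
    """Radiator area needed to reject `power_w` watts to a cold sink.

    First-order balance: the panel radiates
    eps * sigma * A * (T_surface^4 - T_sink^4).
    """
    flux = emissivity * SIGMA * (t_surface_k**4 - t_sink_k**4)
    return power_w / flux

# Example: 20 W of electronics heat, white-paint radiator (eps ~ 0.9)
# at 40 C (313 K), radiating to deep space (~4 K):
area = radiator_area(20.0, 0.9, 313.0, 4.0)  # roughly 0.04 m^2
```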

<h3 id="power-efficiency">Power Efficiency</h3>

<p>Solar panels generate limited power, and batteries have finite capacity. Every watt matters:</p>

<ul>
  <li><strong>Low-power processor selection</strong> — choosing architectures that deliver maximum computation per watt</li>
  <li><strong>Dynamic power management</strong> — scaling clock frequencies and shutting down unused peripherals based on the current mission phase</li>
  <li><strong>Power budgeting</strong> — allocating and tracking power consumption at the subsystem level throughout the design lifecycle</li>
</ul>
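<p>Subsystem-level power budgeting reduces to a ledger of loads and duty cycles. The sketch below is hypothetical (the load names and wattages are invented for illustration), but it captures the orbit-average bookkeeping described above.</p>

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    watts: float
    duty_cycle: float  # fraction of the orbit the load is powered

def orbit_average_power(loads):
    """Orbit-average draw: sum of wattage weighted by duty cycle."""
    return sum(l.watts * l.duty_cycle for l in loads)

budget = [
    Load("flight computer", 4.0, 1.0),   # always on
    Load("radio (TX)", 8.0, 0.1),        # ground passes only
    Load("payload camera", 6.0, 0.25),   # imaging windows
]
avg = orbit_average_power(budget)   # 4.0 + 0.8 + 1.5 = 6.3 W
margin = 10.0 - avg                 # against a 10 W solar allocation
```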

<h2 id="from-ground-to-orbit">From Ground to Orbit</h2>

<p>One of Upper’s key strengths is that our space embedded systems design builds on proven ground-based robotics architectures. Many of the same principles apply — our ground vehicles also operate in extreme temperatures, high vibration, and remote environments where maintenance is impossible. Space takes these constraints to their logical extreme, but the engineering discipline is the same.</p>

<p>This means our space designs benefit from battle-tested ground software, proven communication protocols, and reliable sensor interface patterns — adapted and hardened for the orbital environment.</p>

<h2 id="what-we-deliver">What We Deliver</h2>

<p>Our space embedded systems services cover the full design lifecycle:</p>

<ul>
  <li>Requirements analysis and mission-specific trade studies</li>
  <li>Schematic design and PCB layout for space-rated boards</li>
  <li>Radiation analysis and mitigation strategy</li>
  <li>Thermal modeling and management design</li>
  <li>Firmware and flight software development</li>
  <li>Environmental testing support (thermal vacuum, vibration, radiation)</li>
</ul>

<p>If you’re developing a spacecraft, satellite, or launch vehicle and need reliable embedded computing, <a href="/contact/">contact us</a> to discuss how we can support your mission.</p>]]></content><author><name>Upper Bound Robotics</name></author><category term="aerospace" /><category term="space" /><category term="embedded-systems" /><category term="hardware" /><summary type="html"><![CDATA[Key engineering challenges and design principles for building reliable embedded computing systems that operate in the harsh conditions of space.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://upperbound.space/assets/images/blog/space-embedded-systems.png" /><media:content medium="image" url="https://upperbound.space/assets/images/blog/space-embedded-systems.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">ML-Enhanced Flight Control: Bridging Classical Control and AI</title><link href="https://upperbound.space/blog/2026/03/15/ml-enhanced-flight-control/" rel="alternate" type="text/html" title="ML-Enhanced Flight Control: Bridging Classical Control and AI" /><published>2026-03-15T00:00:00+00:00</published><updated>2026-03-15T00:00:00+00:00</updated><id>https://upperbound.space/blog/2026/03/15/ml-enhanced-flight-control</id><content type="html" xml:base="https://upperbound.space/blog/2026/03/15/ml-enhanced-flight-control/"><![CDATA[<p>Classical flight control theory has served aviation well for decades. PID controllers, LQR, and model predictive control (MPC) provide stable, predictable behavior when the system dynamics are well understood. But autonomous aerial vehicles increasingly operate in conditions where classical approaches alone fall short — unpredictable wind gusts, shifting payloads, degraded actuators, and environments where a mathematical model simply can’t capture every variable.</p>

<p>This is where machine learning enters the picture — not to replace classical control, but to augment it.</p>

<h2 id="the-hybrid-approach">The Hybrid Approach</h2>

<p>At Upper, our flight controller combines classical control algorithms with ML-enhanced components that handle the parts of flight control where traditional methods struggle:</p>

<p><strong>Adaptive disturbance rejection.</strong> Wind is the enemy of stable flight. Classical controllers can compensate for steady winds, but turbulence and gusts introduce rapid, unpredictable forces. Our ML layer learns the vehicle’s response characteristics in real time and adjusts control outputs faster than a traditional feedback loop can react.</p>

<p><strong>Payload adaptation.</strong> When a UAV picks up a payload, drops a delivery, or expends fuel, its mass distribution changes — sometimes dramatically. Our neural network continuously estimates the vehicle’s inertial properties and feeds updated parameters to the classical controller, maintaining stability without requiring manual recalibration.</p>

<p><strong>Actuator degradation.</strong> Motors wear out. Propellers chip. ESCs overheat. Our system detects degraded actuators through learned performance baselines and redistributes control authority across the remaining healthy actuators — enabling continued safe operation or controlled landing.</p>
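<p>One way to picture the hybrid architecture is a classical loop with a bounded learned trim. The sketch below is a deliberately minimal, hypothetical illustration (the gains and the correction source are invented): the classical PD term carries the stability guarantees, and the ML contribution is clamped to a fixed authority budget so it can only trim residual error, never override the classical controller.</p>

```python
class HybridController:
    """Classical PD loop plus a bounded learned correction.

    The learned term is clamped so the classical controller's
    stability margins dominate; the ML layer only trims residual
    error (e.g. gust response the linear model misses).
    """
    def __init__(self, kp, kd, max_ml_correction):
        self.kp, self.kd = kp, kd
        self.max_ml = max_ml_correction

    def update(self, error, error_rate, ml_correction):
        classical = self.kp * error + self.kd * error_rate
        # Clamp the learned contribution to a fixed authority budget.
        bounded = max(-self.max_ml, min(self.max_ml, ml_correction))
        return classical + bounded

ctrl = HybridController(kp=2.0, kd=0.5, max_ml_correction=0.2)
cmd = ctrl.update(error=0.1, error_rate=-0.05, ml_correction=1.0)
# learned term saturates at 0.2: cmd = 0.2 - 0.025 + 0.2 = 0.375
```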

<h2 id="why-not-pure-ml-control">Why Not Pure ML Control?</h2>

<p>A fair question. End-to-end neural network controllers exist in research, but they lack the guarantees that aerospace applications demand:</p>

<ul>
  <li><strong>Predictability</strong> — Classical controllers have well-understood stability margins. Pure ML controllers are black boxes.</li>
  <li><strong>Certification</strong> — Aviation regulators require provable safety bounds. Classical control theory provides these. ML does not, at least not yet.</li>
  <li><strong>Failure modes</strong> — When a classical controller fails, it fails in understood ways. When an ML controller encounters out-of-distribution inputs, the failure mode is unpredictable.</li>
</ul>

<p>Our hybrid architecture gives us the best of both worlds: the safety and predictability of classical control with the adaptability and learning capability of ML.</p>

<h2 id="running-on-embedded-hardware">Running on Embedded Hardware</h2>

<p>Flight control loops run at 400Hz or higher. That’s 2.5 milliseconds per control cycle — including sensor reads, state estimation, control computation, and actuator output. Our ML components are designed for this constraint:</p>

<ul>
  <li>Lightweight neural networks with microsecond inference times</li>
  <li>Fixed-point arithmetic optimized for the flight controller’s onboard processor</li>
  <li>No cloud dependency — everything runs on the vehicle</li>
</ul>
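<p>The fixed-point point is worth seeing in miniature. Below is a toy Q15 dense-layer step, written in Python purely for clarity; an actual deployment would use the target processor's fixed-point intrinsics. The key property is that every operation is an integer multiply, add, or shift, so the worst-case cycle count is deterministic.</p>

```python
def q15(x: float) -> int:
    """Quantize a float in [-1, 1) to Q15 fixed point."""
    return int(round(x * (1 << 15)))

def dense_q15(inputs, weights, bias):
    """One fixed-point neuron: all arithmetic is integer.

    A product of two Q15 values is Q30; shifting right by 15
    returns to Q15. No floating point is touched.
    """
    acc = bias  # bias already in Q15
    for x, w in zip(inputs, weights):
        acc += (x * w) >> 15
    return acc

x = [q15(0.5), q15(-0.25)]
w = [q15(0.8), q15(0.4)]
y = dense_q15(x, w, bias=q15(0.1))
# roughly q15(0.5*0.8 - 0.25*0.4 + 0.1) = q15(0.4), within rounding
```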

<h2 id="across-platforms">Across Platforms</h2>

<p>Our ML-enhanced flight controller currently supports:</p>

<ul>
  <li><strong>Multicopters</strong> — advanced attitude estimation and agile maneuvering in confined or GPS-denied environments</li>
  <li><strong>Fixed-wing</strong> — energy-optimized cruise control and autonomous waypoint navigation for long-endurance missions</li>
  <li><strong>VTOL</strong> — seamless transition control between hover and forward flight with ML-managed mode switching</li>
</ul>

<h2 id="whats-next">What’s Next</h2>

<p>We’re expanding our ML capabilities to include predictive maintenance — using flight data patterns to forecast actuator failures before they happen, enabling proactive servicing rather than reactive repairs. We’re also investigating reinforcement learning approaches for extreme maneuvering scenarios where classical control theory has no optimal solution.</p>

<p>Interested in our flight controller for your aerial platform? <a href="/contact/">Contact us</a> to discuss integration and licensing options.</p>]]></content><author><name>Upper Bound Robotics</name></author><category term="flight-control" /><category term="machine-learning" /><category term="aerospace" /><category term="uav" /><summary type="html"><![CDATA[How machine learning is transforming flight control systems — from adaptive disturbance rejection to intelligent fault tolerance in autonomous aerial vehicles.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://upperbound.space/assets/images/blog/ml-flight-control.png" /><media:content medium="image" url="https://upperbound.space/assets/images/blog/ml-flight-control.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Vision-Guided Landing: Using Semantic Segmentation for Autonomous Runway Detection</title><link href="https://upperbound.space/blog/2026/03/01/vision-guided-landing-runway-detection/" rel="alternate" type="text/html" title="Vision-Guided Landing: Using Semantic Segmentation for Autonomous Runway Detection" /><published>2026-03-01T00:00:00+00:00</published><updated>2026-03-01T00:00:00+00:00</updated><id>https://upperbound.space/blog/2026/03/01/vision-guided-landing-runway-detection</id><content type="html" xml:base="https://upperbound.space/blog/2026/03/01/vision-guided-landing-runway-detection/"><![CDATA[<p>Landing is the most critical phase of any fixed-wing flight. For manned aircraft, pilots rely on a combination of visual cues, instrument landing systems (ILS), and ground-based navigation aids. But for autonomous UAVs operating in austere environments — remote airstrips, forward operating bases, or improvised runways — these ground-based aids often don’t exist.</p>

<p>This is where vision-guided landing comes in. By using onboard cameras and real-time semantic segmentation, an autonomous UAV can detect, classify, and align with a runway using nothing but its own eyes.</p>

<h2 id="the-problem">The Problem</h2>

<p>Traditional autonomous landing approaches rely on GPS waypoints and pre-programmed glide slopes. This works well on a calm day at a known airfield, but breaks down in real-world conditions:</p>

<ul>
  <li><strong>GPS accuracy</strong> — standard GPS provides 2-5 meter accuracy. For a UAV with a 3-meter wingspan landing on a 15-meter-wide runway, that margin is dangerously thin.</li>
  <li><strong>Unknown or damaged runways</strong> — the runway may have obstacles, surface damage, or dimensions that differ from the mission plan.</li>
  <li><strong>GPS-denied environments</strong> — electronic warfare, jamming, or simply operating in areas with poor satellite coverage can make GPS unreliable or unavailable.</li>
  <li><strong>Crosswind alignment</strong> — GPS tells you where you are, not what the runway looks like ahead. Visual alignment is essential for crosswind corrections.</li>
</ul>

<p>A vision-based system solves all of these by directly perceiving the runway in real time.</p>

<h2 id="semantic-segmentation-for-runway-detection">Semantic Segmentation for Runway Detection</h2>

<p>At the core of our approach is a lightweight semantic segmentation model running on the UAV’s embedded compute platform. The model classifies every pixel in the forward-facing camera image into categories:</p>

<ul>
  <li><strong>Runway surface</strong> — paved, gravel, or grass landing strips</li>
  <li><strong>Runway markings</strong> — centerline, threshold, and touchdown zone markings</li>
  <li><strong>Surrounding terrain</strong> — grass, dirt, water, structures, and obstacles</li>
  <li><strong>Sky</strong> — used for horizon reference and attitude validation</li>
</ul>

<p>By segmenting the entire scene at frame rate, the system builds a continuous, pixel-level understanding of the landing environment — far richer than a bounding box detector could provide.</p>

<h2 id="from-pixels-to-flight-commands">From Pixels to Flight Commands</h2>

<p>Raw segmentation output is just a colored image. The real engineering challenge is converting that into actionable flight commands:</p>

<p><strong>Runway geometry extraction.</strong> From the segmented runway mask, we compute the runway centerline, width, length, and orientation in the image frame. Using the camera’s known intrinsics and the UAV’s altitude from the barometric altimeter, we transform these pixel measurements into real-world coordinates.</p>

<p><strong>Glide slope computation.</strong> The detected runway threshold position, combined with the UAV’s current altitude and distance, defines the required glide slope angle. Our controller continuously adjusts pitch to maintain the target glide slope — typically 3 degrees for a standard approach.</p>

<p><strong>Lateral alignment.</strong> The offset between the runway centerline and the image center tells the controller how much lateral correction is needed. This is especially critical in crosswind conditions, where the UAV must crab into the wind while keeping its ground track aligned with the runway.</p>

<p><strong>Flare and touchdown.</strong> In the final seconds before touchdown, the system transitions from glide slope tracking to a flare maneuver — reducing descent rate and pitching up slightly to achieve a smooth touchdown. The segmentation model’s detection of the runway threshold triggers this transition.</p>
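<p>The geometry steps above can be sketched with basic trigonometry. The functions below are a simplified, hypothetical illustration under a small-angle pinhole-camera assumption (all names and numbers are invented for the example); a real pipeline would use the full camera intrinsics and a state estimator rather than raw per-frame measurements.</p>

```python
import math

def glide_slope_error(altitude_m, distance_m, target_deg=3.0):
    """Difference between current and target glide slope angles.

    Altitude comes from the barometric altimeter; along-track
    distance to the detected threshold comes from the projected
    runway geometry. Positive error means the aircraft is high.
    """
    current = math.degrees(math.atan2(altitude_m, distance_m))
    return current - target_deg

def centerline_offset_deg(centerline_px, image_width_px, hfov_deg):
    """Angular offset of the runway centerline from boresight.

    Small-angle pinhole approximation: pixels map linearly to
    angle across the horizontal field of view.
    """
    off_px = centerline_px - image_width_px / 2.0
    return off_px * (hfov_deg / image_width_px)

# 60 m up, 1000 m from the threshold: slightly above a 3 deg slope
err = glide_slope_error(60.0, 1000.0)          # ~ +0.43 deg (high)
lat = centerline_offset_deg(700, 1280, 70.0)   # ~ +3.3 deg right
```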

<h2 id="why-segmentation-over-detection">Why Segmentation Over Detection</h2>

<p>A common question: why not just use an object detector to find the runway with a bounding box? There are several reasons:</p>

<ul>
  <li><strong>Shape matters.</strong> A bounding box tells you where the runway is, but not its orientation, width, or centerline position. Segmentation gives you the exact shape.</li>
  <li><strong>Partial visibility.</strong> On a long final approach, only part of the runway may be visible. Segmentation handles partial views naturally; detectors struggle.</li>
  <li><strong>Surface condition.</strong> Segmentation can distinguish between usable runway surface and damaged or obstructed areas. A bounding box cannot.</li>
  <li><strong>Sub-pixel precision.</strong> For landing, you need angular accuracy to within hundredths of a degree. Pixel-level segmentation provides this; bounding boxes do not.</li>
</ul>

<h2 id="running-at-the-edge">Running at the Edge</h2>

<p>Our segmentation model is optimized for real-time inference on embedded hardware:</p>

<ul>
  <li><strong>Architecture</strong> — a lightweight encoder-decoder network designed for the segmentation task, not a repurposed ImageNet backbone</li>
  <li><strong>Resolution</strong> — we run at the native camera resolution to preserve the fine details needed for centerline extraction</li>
  <li><strong>Latency</strong> — inference completes in under 15 milliseconds on NVIDIA Jetson-class hardware, well within the control loop timing budget</li>
  <li><strong>Robustness</strong> — trained on diverse lighting conditions, weather, and runway types to handle real-world variability</li>
</ul>

<h2 id="beyond-runways">Beyond Runways</h2>

<p>The same vision-guided approach extends to other precision landing scenarios:</p>

<ul>
  <li><strong>Ship deck landing</strong> — detecting and tracking a moving landing pad on a vessel</li>
  <li><strong>Rooftop landing</strong> — identifying safe landing zones on building rooftops for urban UAV operations</li>
  <li><strong>Field landing</strong> — selecting and aligning with suitable emergency landing sites in open terrain</li>
  <li><strong>Planetary landing</strong> — terrain-relative navigation for spacecraft and Mars/Lunar landers</li>
</ul>

<p>If you’re developing an autonomous fixed-wing platform and need reliable vision-guided landing, <a href="/contact/">contact us</a> to discuss how our perception and flight control systems can integrate with your vehicle.</p>]]></content><author><name>Upper Bound Robotics</name></author><category term="computer-vision" /><category term="segmentation" /><category term="uav" /><category term="flight-control" /><category term="aerospace" /><summary type="html"><![CDATA[How real-time semantic segmentation enables autonomous fixed-wing UAVs to detect runways and execute precision landings without ground-based navigation aids.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://upperbound.space/assets/images/blog/vision-guided-landing.png" /><media:content medium="image" url="https://upperbound.space/assets/images/blog/vision-guided-landing.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry></feed>