Imagine the thrill, and the sheer terror, of launching a $10 billion telescope into space, only to find its vision isn't quite as sharp as hoped. How do you fine-tune an observatory more than a million kilometers away, where no astronaut can swoop in to adjust the optics? This is the story of the James Webb Space Telescope (JWST), and of how our team sharpened its view without ever touching it.
Back in late 2021, right after a festive Christmas feast, my family and I were glued to the TV, holding our breath through the launch of NASA's James Webb Space Telescope (JWST), a $10 billion instrument (around $15 billion in Australian dollars). It marked the biggest leap in telescope technology since the Hubble Space Telescope blasted off in 1990. You can relive that moment in this YouTube clip if you haven't seen it yet.
The journey to deployment was no walk in the park—JWST had to dodge 344 potential failure points, a true 'one-shot-to-do-the-impossible' feat as detailed by Northrop Grumman. Luckily, the launch exceeded expectations, with extra fuel even promising a longer lifespan than anticipated, as NASA scientists explained. We could finally exhale.
Fast-forward six months, and JWST unveiled its debut images of some of the universe's most remote galaxies, a breathtaking achievement. Yet for our team in Australia, this was just the starting line. Our mission: to harness JWST's finest possible resolution with a device called the Aperture Masking Interferometer, or AMI for short. This precisely machined piece of metal, slotted into one of the telescope's cameras, lets the telescope operate at the very limit of its resolution.
We've just published our findings in two papers on the open-access platform arXiv. One details our testing and enhancement of AMI, while the other showcases its first observations of celestial targets: stars, planets, moons, and even the powerful jets launched from around black holes.
Operating an instrument from more than a million kilometers away demands a whole new playbook.
Remember Hubble? It launched with a blurry view because its mirror was polished with pinpoint accuracy but in the wrong shape. Astronomers fixed it by studying familiar stars and crafting a corrective 'prescription,' much like an eye doctor refining your glasses. That meant sending seven astronauts aboard the Space Shuttle Endeavour in 1993 to install fresh optics. Hubble orbits just a few hundred kilometers from Earth, making astronaut visits feasible.
JWST, however, sits about 1.5 million kilometers out—way beyond reach. No hardware swaps or crewed repairs here. We had to innovate with software and smart design.
Enter AMI, the lone Australian component aboard JWST, conceived by astronomer Peter Tuthill from the University of Sydney. Its role? To spot and quantify any fuzziness in JWST's images. Even tiny distortions—down to nanometers—in the telescope's 18 hexagonal primary mirrors or internal optics could blur pictures enough to obscure planets or black holes, where pinpoint accuracy is everything.
AMI achieves this by filtering light through a patterned array of holes in a simple metal plate, simplifying the detection of optical glitches. It's like adding a diagnostic tool to your camera to check for imperfections.
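The principle can be sketched in a few lines of code. The hole positions below are invented for illustration, not AMI's real geometry; the point is that every pair of holes produces an interference fringe at its own spatial frequency, so in a well-designed (non-redundant) mask each fringe traces back to exactly one pair of holes, making optical errors far easier to pin down.

```python
import numpy as np

# Hypothetical hole positions (NOT AMI's real layout), in arbitrary units.
holes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])

# Every pair of holes defines a "baseline" vector, i.e. one fringe frequency.
baselines = [tuple(np.round(holes[j] - holes[i], 6))
             for i in range(len(holes)) for j in range(i + 1, len(holes))]

# A non-redundant mask: no two pairs share the same baseline, so each
# fringe in the image can be attributed to exactly one pair of holes.
assert len(baselines) == len(set(baselines))  # 4 holes give 6 unique baselines
```

With the full mirror, every patch of glass interferes with every other patch and errors blend together; the mask's few isolated holes keep each measurement separable.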
We aimed to use AMI for thrilling pursuits, like studying planet-forming regions or matter spiraling into black holes. But first, AMI revealed JWST wasn't performing perfectly. At pixel-level detail, images suffered from slight blurriness caused by an electronic quirk: brighter pixels bleeding into darker neighbors. This isn't a defect per se, but a natural trait of infrared cameras that proved more disruptive than expected for JWST.
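To picture the effect, here is a toy numpy sketch (with a made-up leak fraction, not JWST's measured detector behavior) in which each pixel bleeds a fixed share of its charge into its four nearest neighbors: exactly the kind of blur that lets a bright star swamp a faint companion a pixel or two away.

```python
import numpy as np

# Toy model of inter-pixel charge bleed (illustrative numbers only):
# each pixel keeps most of its charge and leaks a fraction into its
# four nearest neighbors.
def bleed(scene, leak=0.1):
    kernel = np.array([[0.0,          leak, 0.0],
                       [leak, 1 - 4 * leak, leak],
                       [0.0,          leak, 0.0]])
    padded = np.pad(scene, 1)
    out = np.zeros_like(scene, dtype=float)
    # Convolve by shifting the padded image under the 3x3 kernel.
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + scene.shape[0],
                                           dx:dx + scene.shape[1]]
    return out

# A single bright "star" landing on one pixel...
scene = np.zeros((5, 5))
scene[2, 2] = 1000.0
blurred = bleed(scene)
# ...keeps only ~60% of its charge, while each neighbor gains ~10%:
# plenty to hide a companion thousands of times fainter.
```

The real detector effect is more complicated than a fixed kernel, but the consequence is the same: light that should stay in one pixel ends up in its neighbors.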
For spotting exoplanets thousands of times dimmer than their host stars and just a few pixels apart, this blurred vision was a major setback. My colleagues' analyses showed the limits were over ten times worse than anticipated—potentially derailing our ambitions.
So, we rolled up our sleeves to rectify it.
In a new arXiv paper led by University of Sydney PhD student Louis Desdoigts, we observed stars with AMI to tackle both optical and electronic distortions at once. We developed a computer simulation capturing the physics of AMI, flexible enough to account for mirror shapes, apertures, and the colors of stars. Linking this simulation to a machine learning algorithm produced an 'effective detector model', one tuned to reproduce the data accurately rather than to explain the underlying causes.
After training on test stars and validating results, we could compute and reverse the blur in other datasets. Crucially, this correction happens post-capture during data processing—it refines the information we receive without altering JWST's hardware in space.
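The workflow can be illustrated with a deliberately simplified toy model. Everything below is invented for illustration: a single 'leak' parameter, a grid search standing in for the real machine-learning fit, and random numbers standing in for calibration stars. But the structure mirrors the idea: fit a forward model of the detector to calibration data, then invert it to deblur new observations entirely in post-processing.

```python
import numpy as np

# Simplified sketch of an "effective detector model" (toy numbers only).
def bleed_matrix(n, leak):
    """Linear operator for nearest-neighbor charge bleed on a flattened
    n x n image: each pixel keeps (1 - 4*leak) of its charge and leaks
    `leak` into each in-bounds neighbor."""
    m = np.eye(n * n) * (1 - 4 * leak)
    for i in range(n * n):
        r, c = divmod(i, n)
        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= rr < n and 0 <= cc < n:
                m[rr * n + cc, i] += leak
    return m

n = 5
rng = np.random.default_rng(0)
true_star = rng.uniform(0.0, 1.0, n * n)      # known calibration scene
observed = bleed_matrix(n, 0.08) @ true_star  # what the detector records

# "Training": find the leak fraction that best explains the calibration
# star (a grid search standing in for the machine-learning fit).
leaks = np.linspace(0.0, 0.2, 201)
errors = [np.sum((bleed_matrix(n, leak) @ true_star - observed) ** 2)
          for leak in leaks]
best = leaks[int(np.argmin(errors))]

# "Correction": invert the fitted operator to deblur new observations,
# with no change to the hardware in space.
corrected = np.linalg.solve(bleed_matrix(n, best), observed)
```

Because the fitted operator is linear and well conditioned here, inverting it recovers the original scene almost exactly; the real pipeline faces noise and a far richer model, but the fit-then-invert logic is the same.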
The payoff was spectacular. Take HD 206893, a star orbited by a faint planet and the reddest known brown dwarf (an object straddling the line between star and planet). Previously invisible in JWST's blurred data, both now stand out vividly in our corrected maps.
This breakthrough unlocks AMI's potential for hunting undiscovered worlds at unprecedented resolutions and sensitivities, potentially revolutionizing exoplanet research.
Our second paper, authored by University of Sydney PhD student Max Charles, extends this beyond mere 'dots'—even planetary ones—to reconstruct intricate images at JWST's utmost resolution. We re-examined challenging targets to push the telescope's boundaries.
With the fix, Jupiter's volcanic moon Io came into sharp focus, its erupting volcanoes tracked in a mesmerizing hour-long timelapse as it spun. AMI's view of the jet erupting from the black hole in NGC 1068's galactic core matched visuals from larger ground-based telescopes. And a delicate dust ribbon around the binary star WR 137—a subtler version of the stunning Apep system—aligned perfectly with theoretical predictions.
The AMI software serves as a prototype for advanced cameras on JWST and its successor, the Nancy Grace Roman Space Telescope. These demand optical precision finer than a nanometer, beyond what materials alone can currently deliver. Our research shows that by meticulously measuring, controlling, and adjusting the optics we have, we can still aspire to detect Earth-like planets across the galaxy.
- Benjamin Pope is an Associate Professor in the School of Mathematical and Physical Sciences at Macquarie University.
- This article first appeared in The Conversation.