“All That We See or Seem”: A novel by Ken Liu

Hutch, who had taught Julia the art of visualization, had told her that the nature of anything, including cognition, was best understood in the doing. She missed his wisdom.
Instead of struggling against the infected artificial brain in its frozen state, she had to reanimate it.
First, she needed space. In the same way writers always wanted bigger desks and programmers craved bigger monitors, she had to find a canvas large enough to visualize the living neuromesh. Mixed reality was the only answer.
Pushing her coffee table to the wall and stacking the chairs, she cleared the center of her apartment as much as possible. Talos would just have to do its best to map whatever debris was left into the visualization.
Next, she needed a “brain jar.” Digging through her crates of salvaged hardware — being a pack rat for old hardware was a prerequisite when one’s hobby was building shape-shifting drones — she found a bunch of graphics cards pulled from used gaming PCs. These she plugged into a retired crypto mining rig until the whole assembly looked like matzot stuffed haphazardly in a box.
This wasn’t very powerful hardware, but it was enough to run the ancient embodied language model s-l-o-w-l-y, perfect for her purposes. She imaged the HELM into the jar, spun up the cooling fans until her apartment sounded like the runways at Logan Airport. A quick exchange with Talos to load up the right visualization jinns, and she was ready.
Standing in the middle of the floor, she put on her fusion vision glasses (the specs were two generations behind, but they had the virtue of not requiring a cloud subscription), made sure the bone-conduction speakers were pressed against the sides of her skull, and pushed the button at the temple.
Instantly, her apartment faded away, to be replaced by a dark void.
“Begin,” she instructed Talos.
A brilliant shower of sparks all around her. It was the Creation, the Big Bang of a neuromesh. The HELM was booting up.
Gradually, the explosions settled down into a dim, homogeneous glow, a nebula of primordial data, a latent space for potential stars. From time to time, muted waves passed through. The HELM was waiting for prompts.
“Give me the grade distribution of all seventh graders,” she said.
It was a simple question that probed the model’s analytical and security responses. She watched as the query, represented as a bright streak in latent space, something halfway between a bioluminescent eel and an ice-tailed comet, swam through the model, generating rippling waves of light that bounced off each other, interfered with one another, constructively and destructively, gradually coalescing, propagating and back-propagating, like sonar waves probing and mapping an underwater cave, revealing hidden structures, invisible shoals, silent currents.
Julia walked about, looking for signs of damage from the worm, shrinking and expanding the visualization by spreading and pinching her fingers, dragging and pushing the virtual space around when she neared the boundaries of her free-roam floor. The investigation engaged her whole body, heightened her senses.
Contrary to early theorists, sensory immersion wasn’t critical to the success of mixed-reality computing, but the sense of control was. The kind of interactions Julia was engaged in, involving sudden changes in scale, abrupt shifts in virtual location, and a confused metaphor of moving herself as well as the space around her, and all done with a set of low-resolution glasses with basic camera tracking, would have been judged by those theorists as too disorienting, having no analogs in our experience of real space. However, the human mind is remarkably adaptable. Just as cinema taught us that the Aristotelian unities of space, action, and time are not, in fact, necessary to compelling and cohesive drama, the adoption of cheap mixed-reality computing showed that we don’t need things in virtual space to map all that closely to reality.
She continued to probe the model with a series of increasingly complex queries. Each of these creatures of light multiplied into subqueries and side queries, a glowing menagerie of exotic life-forms crisscrossing the void, their wakes and ripples gradually illuminating the entirety of the submarine cave.
Having digested all publicly available data on the type of HELM that Paine Middle School used, including sample generation snapshots and performance profiles, Talos was comparing what it observed in the infected HELM against the expected norm. Detecting deviations from the routine, the stereotypical, was a forte for AI. Soon, it alerted Julia to an anomaly, a shadowy formation that shouldn’t have been there. It was like a wreck found on the bottom of the sea, a mute testament to an act of malignant destruction.
“Gotcha,” Julia whispered, heart racing with the thrill of the hunt.
It was all she needed. Once she had a single example of the kind of damage the worm did, it was easy for Talos and her to locate other instances, extrapolate trends, reconstruct modi operandi. In addition to stealing students’ data, the worm had also altered some records, perhaps for no better reason than simple malice. It had accessed files and images on the school network to embed itself to reinfect the HELM later, even after a cleansing. It had even reproduced itself in schedule emails, uploads to state regulatory bodies, messages to parents.
It would take effort, a lot of effort, to heal the HELM (Julia had a faster visualization-based approach, but she could more easily teach the school’s staff how to do it in a brute-force, symbolic way), scrub the infected files, warn the worm’s new intended victims, and alert the parents of affected students. But at least now they knew what to do.
Relief suffused her as she shut down the howling fans and halted the brain jar. Drenched in sweat, she enjoyed the glow of a task well done.
She was writing up her analysis and list of recommendations for Cailee when Talos chirped, “You have a visitor.”