DAY 6 8:12 A.M.
We came into a large room marked UTILITY and beneath it, MOLSTOCK/FABSTOCK/FEEDSTOCK. The walls and ceiling were covered with the familiar smooth plastic laminate. Large laminated containers were stacked on the floor. Off to the right I saw a row of big stainless-steel kettles, sunk below ground with lots of piping and valves surrounding them, and coming up to the first-floor level. It looked exactly like a microbrewery, and I was about to ask Ricky about it when he said, “So there you are!”

Working at a junction box beneath a monitor screen were three more members of my old team. They looked slightly guilty as we came up, like kids caught with their hands in the cookie jar.

Of course Bobby Lembeck was their leader. At thirty-five, Bobby now supervised more code than he wrote, but he could still write when he wanted to. As always, he was wearing faded jeans and a Ghost in the Shell T-shirt, his ubiquitous Walkman clamped to his waist.

Then there was Mae Chang, beautiful and delicate, about as different from Rosie Castro as any woman could be. Mae had worked as a field biologist in Sichuan studying the golden snub-nosed monkey before turning to programming in her mid-twenties. Her time in the field, as well as her natural inclination, led her to be almost silent. Mae said very little, moved almost soundlessly, and never raised her voice—but she never lost an argument, either. Like many field biologists, she had developed the uncanny ability to slip into the background, to become unnoticed, almost to vanish.
And finally Charley Davenport, grumpy, rumpled, and already overweight at thirty. Slow and lumbering, he looked as if he had slept in his clothes, and in fact he often did, after a marathon programming session. Charley had worked under John Holland in Chicago and Doyne Farmer at Los Alamos. He was an expert in genetic algorithms, the kind of programming that mimicked natural selection to hone answers. But he was an irritating personality—he hummed, he snorted, talked to himself, and farted with noisy abandon. The group only tolerated him because he was so talented.
“Does it really take three people to do this?” Ricky said, after I’d shaken hands all around.
“Yes,” Bobby said, “it does take three people, El Rooto, because it’s complicated.”
“Why? And don’t call me El Rooto.”
“I obey, Mr. Root.”
“Just get on with it ...”
“Well,” Bobby said, “I started to check the sensors after this morning’s episode, and it looks to me like they’re miscalibrated. But since nobody is going outside, the question is whether we’re reading them wrong, or whether the sensors themselves are faulty, or just scaled wrong on the equipment in here. Mae knows these sensors, she’s used them in China. I’m making code revisions now. And Charley is here because he won’t go away and leave us alone.”
“Shit, I have better things to do,” Charley said. “But I wrote the algorithm that controls the sensors, and we need to optimize the sensor code after they’re done. I’m just waiting until they stop screwing around. Then I’ll optimize.” He looked pointedly at Bobby. “None of these guys can optimize worth a damn.”
Mae said, “Bobby can.”
“Yeah, if you give him six months, maybe.”
“Children, children,” Ricky said. “Let’s not make a scene in front of our guest.”

I smiled blandly. The truth was, I hadn’t been paying attention to what they were saying. I was just watching them. These were three of my best programmers—and when they had worked for me, they had been self-assured to the point of arrogance. But now I was struck by how nervous the group was. They were all on edge, bickering, jumpy. And thinking back, I realized that Rosie and David had been on edge, too.
Charley started humming in that irritating way of his.
“Oh, Christ,” Bobby Lembeck said. “Would you tell him to shut up?”
Ricky said, “Charley, you know we’ve talked about the humming.”
Charley continued to hum.
“Charley ...”
Charley gave a long, theatrical sigh. He stopped humming.
“Thank you,” Bobby said.
Charley rolled his eyes, and looked at the ceiling.
“All right,” Ricky said. “Finish up quickly, and get back to your stations.”
“Okay, fine.”
“I want everybody in place as soon as possible.”
“Okay,” Bobby said.
“I’m serious. In your places.”
“For Christ’s sake, Ricky, okay, okay. Now will you stop talking and let us work?”

Leaving the group behind, Ricky took me across the floor to a small room. I said, “Ricky, these kids aren’t the way they were when they worked for me.”
“I know. Everybody’s a little uptight right now.”
“And why is that?”
“Because of what’s going on here.”
“And what is going on here?”
He stopped before a small cubicle on the other side of the room. “Julia couldn’t tell you, because it was classified.” He touched the door with a keycard.

I said, “Classified? Medical imaging is classified?”
The door latch clicked open, and we went inside. The door closed behind us. I saw a table, two chairs, a computer monitor and a keyboard. Ricky sat down, and immediately started typing. “The medical imaging project was just an afterthought,” he said, “a minor commercial application of the technology we are already developing.”
“Uh-huh. Which is?”
“Military.”
“Xymos is doing military work?”
“Yes. Under contract.” He paused. “Two years ago, the Department of Defense realized from their experience in Bosnia that there was enormous value to robot aircraft that could fly overhead and transmit battlefield images in real time. The Pentagon knew that there would be more and more sophisticated uses for these flying cameras in future wars. You could use them to spot the locations of enemy troops, even when they were hidden in jungle or in buildings; you could use them to control laser-guided rocket fire, or to identify the location of friendly troops, and so on. Commanders on the ground could call up the images they wanted, in the spectra they wanted—visible, infrared, UV, whatever. Real-time imaging was going to be a very powerful tool in future warfare.”
“Okay ...”
“But obviously,” Ricky said, “these robot cameras were vulnerable. You could shoot them down like pigeons. The Pentagon wanted a camera that couldn’t be shot down. They imagined something very small, maybe the size of a dragonfly—a target too small to hit. But there were problems with power supply, with small control surfaces, and with resolution using such a small lens. They needed a bigger lens.”
I nodded. “And so you thought of a swarm of nanocomponents.”
“That’s right.” Ricky pointed to the screen, where a cluster of black spots wheeled and turned in the air, like birds. “A cloud of components would allow you to make a camera with as large a lens as you wanted. And it couldn’t be shot down because a bullet would just pass through the cloud. Furthermore, you could disperse the cloud, the way a flock of birds disperses with a gunshot. Then the camera would be invisible until it re-formed again. So it seemed an ideal solution. The Pentagon gave us three years of DARPA funding.”
“And?”
“We set out to make the camera. It was of course immediately obvious that we had a problem with distributed intelligence.”
I was familiar with the problem. The nanoparticles in the cloud had to be endowed with a rudimentary intelligence, so that they could interact with each other to form a flock that wheeled in the air. Such coordinated activity might look pretty intelligent, but it occurred even when the individuals making up the flock were rather stupid. After all, birds and fish could do it, and they weren’t the brightest creatures on the planet.
Most people watching a flock of birds or a school of fish assumed there was a leader, and that all the other animals followed the leader. That was because human beings, like most social mammals, had group leaders.
But birds and fish had no leaders. Their groups weren’t organized that way. Careful study of flocking behavior—frame-by-frame video analysis—showed that, in fact, there was no leader. Birds and fish responded to a few simple stimuli among themselves, and the result was coordinated behavior. But nobody was controlling it. Nobody was leading it. Nobody was directing it.
Nor were individual birds genetically programmed for flocking behavior. Flocking was not hard-wired. There was nothing in the bird brain that said, “When thus-and-such happens, start flocking.” On the contrary, flocking simply emerged within the group as a result of much simpler, low-level rules. Rules like, “Stay close to the birds nearest you, but don’t bump into them.” From those rules, the entire group flocked in smooth coordination. Because flocking arose from low-level rules, it was called emergent behavior. The technical definition of emergent behavior was behavior that occurred in a group but was not programmed into any member of the group. Emergent behavior could occur in any population, including a computer population. Or a robot population. Or a nanoswarm.
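A minimal sketch may make the idea concrete. Nothing below comes from the novel or from any Xymos code; it is a toy, boids-style simulation in Python in which each “bird” follows only the two local rules just described, stay near your neighbors and don’t bump into them, and coordinated motion emerges without any leader or any flock-level instruction:

```python
# Toy illustration only (not from the novel): flocking from two local rules.
import random

NEIGHBOR_RADIUS = 5.0   # how far a bird "sees"
MIN_DISTANCE = 1.0      # closer than this counts as bumping

class Bird:
    def __init__(self):
        self.x, self.y = random.uniform(0, 50), random.uniform(0, 50)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(flock):
    for b in flock:
        neighbors = [o for o in flock if o is not b
                     and (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < NEIGHBOR_RADIUS ** 2]
        if not neighbors:
            continue
        # Rule 1: stay close to the birds nearest you (drift toward their center).
        cx = sum(o.x for o in neighbors) / len(neighbors)
        cy = sum(o.y for o in neighbors) / len(neighbors)
        b.vx += 0.01 * (cx - b.x)
        b.vy += 0.01 * (cy - b.y)
        # Rule 2: but don't bump into them (push away from any bird too close).
        for o in neighbors:
            if (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < MIN_DISTANCE ** 2:
                b.vx -= 0.05 * (o.x - b.x)
                b.vy -= 0.05 * (o.y - b.y)
    for b in flock:
        b.x += b.vx
        b.y += b.vy

flock = [Bird() for _ in range(30)]
for _ in range(200):
    step(flock)   # no individual bird is programmed to "flock"
```

No rule in the sketch mentions the flock; the group behavior is written into no individual, which is exactly the definition of emergence given above.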
I said to Ricky, “Your problem was emergent behavior in the swarm?”
“Exactly.”
“It was unpredictable?”
“To put it mildly.”
In recent decades, this notion of emergent group behavior had caused a minor revolution in computer science. What that meant for programmers was that you could lay down rules of behavior for individual agents, but not for the agents acting together. Individual agents—whether programming modules, or processors, or as in this case, actual micro-robots—could be programmed to cooperate under certain circumstances, and to compete under other circumstances. They could be given goals. They could be instructed to pursue their goals with single-minded intensity, or to be available to help other agents. But the result of these interactions could not be programmed. It just emerged, with often surprising outcomes.
In a way this was very exciting. For the first time, a program could produce results that absolutely could not be predicted by the programmer. These programs behaved more like living organisms than man-made automatons. That excited programmers—but it frustrated them, too. Because the program’s emergent behavior was erratic. Sometimes competing agents fought to a standstill, and the program failed to accomplish anything. Sometimes agents were so influenced by one another that they lost track of their goal, and did something else instead. In that sense the program was very childlike—unpredictable and easily distracted. As one programmer put it, “Trying to program distributed intelligence is like telling a five-year-old kid to go to his room and change his clothes. He may do that, but he is equally likely to do something else and never return.”
Because these programs behaved in a lifelike way, programmers began to draw analogies to the behavior of real organisms in the real world. In fact, they began to model the behavior of actual organisms as a way to get some control over program outcomes. So you had programmers studying ant swarming, or termite mounding, or bee dancing, in order to write programs to control airplane landing schedules, or package routing, or language translation. These programs often worked beautifully, but they could still go awry, particularly if circumstances changed drastically. Then they would lose their goals. That was why I began, five years ago, to model predator-prey relationships as a way to keep goals fixed. Because hungry predators weren’t distracted. Circumstances might force them to improvise their methods; and they might try many times before they succeeded—but they didn’t lose track of their goal.
So I became an expert in predator-prey relationships. I knew about packs of hyenas, African hunting dogs, stalking lionesses, and attacking columns of army ants. My team had studied the literature from the field biologists, and we had generalized those findings into a program module called PREDPREY, which could be used to control any system of agents and make its behavior purposeful. To make the program seek a goal.
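PREDPREY itself is the narrator’s fictional module and none of its code appears in the text. Purely as a hedged illustration of the idea it names, agents whose goal stays fixed while the method of reaching it is improvised, a sketch in Python might look like this (every name here is hypothetical):

```python
# Hypothetical sketch of the idea behind PREDPREY (the module is fictional).
# Each agent keeps a single fixed goal, the prey's position, and improvises
# only the path it takes toward that goal on each step.
import math
import random

class PursuitAgent:
    def __init__(self, x, y, speed=1.0):
        self.x, self.y, self.speed = x, y, speed

    def step(self, goal_x, goal_y):
        dx, dy = goal_x - self.x, goal_y - self.y
        distance = math.hypot(dx, dy)
        if distance < self.speed:
            self.x, self.y = goal_x, goal_y   # goal reached
            return
        # Improvised method: a noisy heading, but always aimed at the same goal.
        heading = math.atan2(dy, dx) + random.uniform(-0.3, 0.3)
        self.x += self.speed * math.cos(heading)
        self.y += self.speed * math.sin(heading)

pack = [PursuitAgent(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(5)]
prey = (40.0, 40.0)
for _ in range(100):
    for agent in pack:
        agent.step(*prey)   # the goal never drifts, however noisy each step is
```

The only point of the sketch is the fixed goal: unlike the distractible agents described earlier, nothing in the loop ever replaces the target.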
Looking at Ricky’s screen, the coordinated units moving smoothly as they turned through the air, I said, “You used PREDPREY to program your individual units?”
“Right. We used those rules.”
“Well, the behavior looks pretty good to me,” I said, watching the screen. “Why is there a problem?”
“We’re not sure.”
“What does that mean?”
“It means we know there’s a problem, but we’re not sure what’s causing it. Whether the problem is programming—or something else.”
“Something else? Like what?” I frowned. “I don’t get it, Ricky. This is just a cluster of microbots. You can make it do what you want. If the programming’s not right, you adjust it. What don’t I understand?”
Ricky looked at me uneasily. He pushed his chair away from the table and stood. “Let me show you how we manufacture these agents,” he said. “Then you’ll understand the situation better.”

Having watched Julia’s demo tape, I was immensely curious to see what he showed me next. Because many people I respected thought molecular manufacturing was impossible. One of the major theoretical objections was the time it would take to build a working molecule. To work at all, the nanoassembly line would have to be far more efficient than anything previously known in human manufacturing. Basically, all man-made assembly lines ran at roughly the same speed: they could add one part per second. An automobile, for example, had a few thousand parts. You could build a car in a matter of hours. A commercial aircraft had six million parts, and took several months to build.
But a typical manufactured molecule consisted of 10²⁵ parts. That was 10,000,000,000,000,000,000,000,000 parts. As a practical matter, this number was unimaginably large. The human brain couldn’t comprehend it. But calculations showed that even if you could assemble at the rate of a million parts per second, the time to complete one molecule would still be 3,000 trillion years—longer than the known age of the universe. And that was a problem. It was known as the build-time problem.
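As a rough order-of-magnitude check, using only the two figures quoted above (10²⁵ parts, a million parts per second):

$$
\frac{10^{25}\ \text{parts}}{10^{6}\ \text{parts per second}} = 10^{19}\ \text{seconds},
$$

and only about $4\times 10^{17}$ seconds have elapsed since the Big Bang, so the qualitative point stands: a part-at-a-time assembler could never finish even one such molecule.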
I said to Ricky, “If you’re doing industrial manufacturing ...”
“We are.”
“Then you must have solved the build-time problem.”
“We have.”
“How?”
“Just wait.”
Most scientists assumed this problem would be solved by building from larger subunits, molecular fragments consisting of billions of atoms. That would cut the assembly time down to a couple of years. Then, with partial self-assembly, you might get the time down to several hours, perhaps even one hour. But even with further refinements, it remained a theoretical challenge to produce commercial quantities of product. Because the goal was not to manufacture a single molecule in an hour. The goal was to manufacture several pounds of molecules in an hour. No one had ever figured out how to do that.
We passed a couple of laboratories, including one that looked like a standard microbiology lab, or a genetics lab. I saw Mae standing in that lab, puttering around. I started to ask Ricky why he had a microbiology lab here, but he brushed my question aside. He was impatient now, in a hurry. I saw him glance at his watch. Directly ahead was a final glass airlock. Stenciled on the glass door was MicroFabrication. Ricky waved me in. “One at a time,” he said. “That’s all the system allows.”
I stepped in. The doors hissed shut behind me, the pressure pads again thunking shut. Another blast of air: from below, from the sides, from above. By now I was getting used to it. The second door opened, and I walked forward down another short corridor, opening into a large room beyond. I saw bright, shining white light—so bright it hurt my eyes. Ricky came after me, talking as we walked, but I don’t remember what he said. I couldn’t focus on his words. I just stared. Because by now I was inside the main fab building—a huge windowless space, like a giant hangar three stories high. And within this hangar stood a structure of immense complexity that seemed to hang in midair, glowing like a jewel.