After obtaining the two Longinus, Patchouli lent them to Zeroy that very night so she could begin her own experiment.
She was convinced that her approach could let everyone live in a beautiful environment and shape a perfect utopia.
However, without testing, it would remain just a theory.
What's more, the matter was of great importance, so she had to be extremely cautious.
Since she didn't yet have the conditions to run a true experiment, she would first use the power of the two Longinus to run a simulation.
The first simulated location was set on an island inside Little Hell.
Using [Azure Box of Revolution], she created a modern society. Given the island's limited size, this society had only twenty thousand people.
Then she created an advanced AI to monitor, protect, and restrain people's criminal behavior.
Finally, using the rule-altering power of [Ultimate Karmic Wheel], she accelerated time in this society hundreds of times over.
The miniature world immediately began its cycle of day and night, and people's movements became almost mere afterimages.
As expected, after only a few minutes—half a month in the miniature world—the world collapsed.
The people rioted, paying a heavy price to destroy the AI's central control core.
The reason was that the AI's logic wasn't rigorous enough, so the absolute fairness, reason, and correctness Zeroy envisioned never materialized.
The AI made incorrect judgments and ultimately incurred the hostility and resistance of the people.
Zeroy felt a little disappointed, yet not heartbroken.
This result was within her expectations.
Although she had already tried to perfect the AI's logic beforehand, she also knew that wasn't enough.
The variables and situations in reality were too changeable. The first generation of AI couldn't possibly be absolutely correct.
Next came improvement based on failure.
In the second simulation, the people rioted on the tenth day, still because the AI's logic design wasn't perfect.
Third time... fourth time... fifth time...
By the sixth time, before the AI even made a mistake, problems arose among the humans themselves.
The reason was simple: bad actors appeared among the populace. A high-IQ criminal exploited the AI, causing a severe incident. Although it was eventually quelled, people lost their trust in the AI because of it.
Continue improving, continue simulating.
The seventh time, unequal resource distribution caused conflict, and the AI's rigid justice instead produced injustice.
By the thirtieth simulation, the lifespan of human society had basically stabilized at about three years.
This progress was actually not much, because three years is far too short for a society's lifespan.
Placed in the river of history, it isn't even a flash in the pan.
Although Zeroy had solved many problems through simulation, countless more still lay before her.
When the scope of the simulation expanded from a small island to a modern metropolis, the speed of social collapse multiplied, and the problems were endless.
In the simulation, her most direct feeling was—there are too many problems. Why so many troubles? Why are you all so complicated!
Yet when she looked closely at those problems, she realized they weren't being difficult or making trouble for nothing; they were reasonable, normal demands.
Her settings were simply not comprehensive or refined enough, which caused the issues.
It left her depressed, uncomfortable, with no place to vent.
She could only bury her head and make revisions.
Right now, she felt like she was facing a tangle of threads with no end.
Of course, she could use absolute power to force everyone to behave, even stipulate exactly what they could do or think each day.
Yet that wasn't what she wanted. That would turn humans into machines, screws and gears in this mechanical fortress called utopia.
Too much freedom brings chaos, while too much discipline and restriction kills people in spirit and soul.
She only wanted to restrict people from committing crimes, to let them live honest, rich, and carefree lives.
To make people genuinely acknowledge this utopia from the heart.
Forcing people to acknowledge it was not what she wanted.
She didn't mind ruling by fear, or having gun-toting patrol robots watch everyone; yet that wasn't to force people to accept the society, only to keep them from committing crimes and destroying others' rightful happiness.
Simple stability had no meaning. Whether people were happy was the most important thing.
Stability, harmony—they're just byproducts, side effects, of people becoming happy.
A stable society filled with anger and resentment was absolutely not what Zeroy wanted, nor could it be called a utopia.
Just as she had always done, her bloodshed and brutality were reserved for pests; toward ordinary innocents, aside from frightening them out of becoming pests themselves, Zeroy harbored no malice.
So she could only keep correcting errors, trying all sorts of directions.
From improving AI logic, to social resource distribution, to social structure and legal formulation—on and on.
She continued simulation after simulation until dawn—
Zeroy had already run hundreds of simulations, and by the middle of the night she had even developed new uses of the two Longinus to simulate multiple societies at once, increasing the number of observation samples.
The simulated societies' lifespan had reached ten years.
For the first time, Zeroy showed signs of fatigue, an exhaustion of both body and mind.
Still, she had no intention of stopping.
She could see the progress. Even though there were countless obstacles, as many as the stars, she was still full of drive.
"A grand vision. And your attitude doesn't seem like just talk. You might actually hit it off with that old good guy, yet—"
Ddraig the Red Dragon Emperor had been watching all night as well.
Patchouli was also there.
While Zeroy conducted the simulation experiments, they stayed by her side, constantly offering advice.
"You should go rest. Never mind exhausting yourself—what I worry about is that this way you'll easily fall into a mental trap, into a dead end of thought."
"Yes, let's stop here for today." Patchouli closed the grimoire she had been using to assist Zeroy's calculations.
"Endless thinking won't necessarily find the answer, and the problem you're facing isn't one that can be solved in a short time. It's a composite of countless questions."
"If you keep going like this, you won't find the answer, and your thinking will only grow more rigid. That's not good for your reasoning."
"Why don't we take a break and talk a bit? Zeroy, what exactly is your view of utopia?"
...
