Frenetic Array

The intersection of logic and imagination.

On Being Passionate About Tools

Get to work. Grab a cup of coffee, some water. Possibly breakfast.

Sit down at my desk. Log into my computer, look at i3wm.

Launch a new terminal session. ranger. Enter. Navigate to project. nvim. Enter.

space f. Use fzf to open necessary file.

This is my daily workflow. Before I start typing code, I use about four tools just to get to it. Notice I haven’t mentioned touching the mouse once; that’s because I haven’t touched the mouse.

Throughout my academic and professional career as a software engineer, I spent a lot of time using tools. Trying new tools. Searching for tools. Customizing tools[1].

Although this might be a commendable feat, it is often met with skepticism. A lot of skepticism.

I’ve had at least a hundred conversations with people about tools, specifically about why they advocate against using them. These conversations were with other students, senior software engineers, and tenured professors; and almost all of them fell into three distinct buckets.

It’s a waste of time.

This is by far the worst yet best counterargument against investing so heavily in tools, and the reasoning is in the trailing three words.

waste. Better tools objectively help one write more code. If a modification needs to be made at the end of the file, it takes one keystroke to jump to that position in Vim. Picture doing this hundreds of times a day.

of time. There is a time commitment, a big one. And there is a learning curve, a steep one. So one has to justify the reasons for learning it, and I’ve done so below.

What I have works just fine.

I’m deeply skeptical of the “If it’s not broken, don’t fix it” mentality. There’s always room for improvement, and there’s nothing I love to see more than improvement.

It’s why we buy new phones. It’s why we like nice, new cars. It’s why we, as a civilization, love progress.

Even if it’s a marginal increase, say 5% more code a week: with 100 lines of code written a week, that’s about 250 additional lines of code a year. A marginal increase, sure, but a welcome one.

It doesn’t really make you a better programmer.

I generally have three core reasons against this, and they are as follows.

It’ll help you write more code, faster. It is objectively faster to type code if your hands never leave the keyboard. It’s faster to navigate a filesystem without having to focus on where a cursor is. It’s ridiculously faster to open a file by typing out part of the filename as opposed to navigating to it.

It’s healthier. Legitimately. Minimizing hand travel puts significantly less strain on your hands, wrists, and arms.

It’ll make you a better programmer. As programmers, we have to learn things all the time: new programming languages, new paradigms, new company software. Learning a new tool keeps the mind sharp by reinforcing how to learn something. Plus, the skill is fluid. Learning to navigate in Vim will help you navigate in ranger. Learning regular expressions will help with fuzzy searches. The skills are transferable.

So, where might one start? Again, there are three areas where improving your workflow can have the most benefit.

Your operating system. Whether it’s Windows, MacOS, or Linux, you have to use an operating system. Get good at it. Learn the paradigms of the system. Learn how the file explorer works in Windows. Learn advanced features in MacOS like a spotlight alternative or simple hot corners. Or learn how to set up your WiFi drivers in Linux.

Your text editor. Choose one that’s portable. I use NeoVim. Many people have success with Emacs. Some people prefer a visual editor like Sublime Text or Atom. Learn their workflows. Learn the shortcuts of Emacs. Learn how to edit text in Normal Mode using Vim. Install packages in Sublime or Atom. You spend most of your time here anyway, might as well get familiar with it.

The intermediary layer between your operating system and your text editor. By this, I usually mean the shell, like Bash or ZSH. Learn how to grep or navigate around. Install packages to make tasks easier. It’s well worth it.

So, yes, I am a strong advocate of using advanced tools. Established tools. Great tools. And I hope I’ve convinced you to do the same.


  1. 145 commits worth. ↩︎

Painting Gabriel’s Horn

There are many things in the world of mathematics that are really quite wonderful — however, I am not sure there is anything more wonderful yet unintuitive than Gabriel's Horn.

Gabriel's Horn is constructed thus: take the function $y = \frac{1}{x}$ where $x \in \mathbb{R}^+, 1 \leq x < \infty$, rotated around the $x$ axis. It's not too difficult to conceptualize; it looks like a horn of sorts. But here's the paradox.

Suppose we want to calculate the volume. Simple enough, using solids of revolution, we can show the volume to be:

$$V = \pi \lim_{t \rightarrow \infty} \int _1 ^t \frac{1}{x^2} dx = \pi \lim _{t \rightarrow \infty} ( 1 - \frac{1}{t} ) = \pi $$

A simple, elegant solution; we can expect the volume to be exactly $\pi$. So, let's see about the surface area.

We know the general definition of the arc length to be $\int _a ^b \sqrt{1 + f'(x)^2} \, dx$, so combining this with our solids of revolution, we should get

$$A = 2\pi \lim _{t \rightarrow \infty} \int _1 ^t \frac{1}{x} \sqrt{1 + \left( -\frac{1}{x^2} \right)^2 } dx $$

This is not a trivial integral; however, there is a trick we can use. Suppose we take the integral $$2\pi \lim _{t \rightarrow \infty} \int _1 ^t \frac{dx}{x}$$ instead. We can show this integral is always less than or equal to the former, because the discarded factor $\sqrt{1 + \left(-\frac{1}{x^2}\right)^2}$ is always at least $1$. So, taking this rather trivial integral, we can see that

$$ A \geq 2\pi \lim _{t \rightarrow \infty} \int _1 ^t \frac{dx}{x} \implies A \geq \lim _{t \rightarrow \infty} 2\pi \ln(t) $$

Wait a minute; it's divergent! So the volume is $V = \pi$, but the surface area $A$ is infinite. This is no mistake; the math is valid. And that is simply wonderful.
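
For a numerical sanity check (my own sketch; it uses SciPy, which the argument itself doesn't need), we can evaluate both truncated integrals for increasing $t$ and watch the volume settle near $\pi$ while the area keeps growing like $2\pi \ln t$:

from math import pi, sqrt

from scipy.integrate import quad


def volume(t):
    """Volume of the horn truncated at x = t."""
    value, _ = quad(lambda x: 1 / x ** 2, 1, t)
    return pi * value


def area(t):
    """Surface area of the horn truncated at x = t."""
    value, _ = quad(lambda x: (1 / x) * sqrt(1 + 1 / x ** 4), 1, t)
    return 2 * pi * value


for t in (10, 100, 10_000):
    print(f"t = {t:>6}: volume = {volume(t):.5f}, area = {area(t):.2f}")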

A horn you can fill with paint, but you can't paint the surface.

Evolutionary Algorithms: An Evolutionary Approach To Problem Solving

Arguably the first (and most successful) problem solver we know of is Evolution. Humans (along with other species) all share a common problem: becoming the best at surviving our environment.

Just as Darwin's finches evolved their beaks to survive different parts of the Galápagos Islands, we, too, evolved to survive different parts of the world. And we can program a computer to do the same.

Evolution inspired a whole generation of problem solving, commonly known as Evolutionary Algorithms (EAs). EAs are known for solving (or approximating solutions to) borderline unsolvable problems. And just as the mechanics of evolution are not that difficult, neither are the mechanics of EAs.

Today, we will build an Evolutionary Algorithm from the ground-up.

An Introduction

Before we proceed with implementation or an in-depth discussion, first we wish to tackle three questions: what is an Evolutionary Algorithm, what does an Evolutionary Algorithm look like, and what problems can Evolutionary Algorithms solve?

What Is An Evolutionary Algorithm?

An Evolutionary Algorithm is a generic, population-based optimization algorithm that generates solutions via biological operators. That is quite a mouthful, so let’s break it up.

Population-based. All Evolutionary Algorithms start by creating a population of random individuals. These individuals are just like an individual in nature: there is a genotype (the genes that make up an individual) and a phenotype (the result of the genotype interacting with the environment). In EAs, they would be defined as follows:

  • Genotype The representation of the solution.
  • Phenotype The solution itself.

Because it’s a little confusing to think of it this way, it’s often better to think about it in terms of a genotype space and a phenotype space.

  • Genotype Space The space of all possible combinations of genes.
  • Phenotype Space The space of all possible solutions.

Don’t worry if this doesn’t make sense, we’ll touch on it later.

Optimization Algorithm. Evolution is an optimization algorithm. Given an environment, it will try to optimize an individual for that environment with some fitness metric. Evolutionary Algorithms operate the same way.

Given an individual, it will try to optimize it. We do not use a literal environment, but still use a fitness metric. The fitness metric is simply a function that takes in the genotype of the individual, and outputs a value that is proportional to how good a solution is.

Because the fitness metric is proportional to how good a solution is, this implies a very important condition for our phenotype space: it must form a gradient, so that better solutions score measurably better than worse ones and the algorithm has something to climb.

Biological operators. Evolutionary Algorithms are inspired by biology and evolution. Just as biology has operators to generate new individuals, so do Evolutionary Algorithms. More on that later.

Generic. Evolutionary Algorithms are generic. Once the framework has been written, it can be reused with any individual (provided the individual has appropriate crossover and mutator operators).

What Does An Evolutionary Algorithm Look Like?

The pseudocode for an Evolutionary Algorithm is what we might expect evolution itself to follow: generate a random population, generate offspring, and let survival of the fittest do its job. And so it does:

BEGIN
    INITIALISE population with random solutions

    WHILE ( TERMINATION CONDITION is not satisfied ) DO
        SELECT parents
        RECOMBINE pairs of parents
        MUTATE the resulting offspring
        EVALUATE new candidates
        SELECT individuals for the next generation
    OD 
END

What Problems Can Evolutionary Algorithms Solve?

Evolutionary Algorithms can solve any problem that has a genotype that can fit within our framework:

  • The genotype can have a crossover operator.
  • The genotype can have a mutator operator.
  • The genotype can map to a definite fitness function.

Again, the fitness should be proportional to how good a solution is. If the fitness function $f(x)$ is bounded by $0 \leq f(x) \leq 100$, 0 should be the worst solution or no solution, and 100 should be the best solution (or vice versa, for inverted fitnesses).

Implementing An Evolutionary Algorithm

We will be implementing a special class of Evolutionary Algorithm, referred to as a (μ + λ)-Evolutionary Strategy. The name is not important, but μ and λ will be; we will come back to them shortly.

For the purposes of our discussion, we are going to consider a sample problem: a deciphering program. The premise of the problem is as follows:

  • There is a string of characters (without spaces) hidden away that, once set, is inaccessible.
  • There are two ways to retrieve data about the hidden message:
    1. Get the length of the string.
    2. Given a string, the program will output how many characters match between the two strings.

The secret message would look as follows:

class SecretMessage:
    def __init__(self, message):
        """Initialize a Secret Message object.

        :message (str): The secret message to hide.
        """
        self.__message = message

    def letters_match(self, message):
        """Determine how many characters match the secret message.

        Note:
            The message must be the same length as the secret message (accessible via the length property).

        :message (str): The message to compare the secret message to.
        :returns (int): The number of characters matched.
        """
        return sum(self.__message[char] == message[char] for char in range(len(message)))

    @property
    def length(self):
        """Get the length of the secret message.

        :returns (int): The length of the secret message.
        """
        return len(self.__message)

Not too complicated.
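
A quick illustration of the interface, with a throwaway message of my own:

secret = SecretMessage("abc")

print(secret.length)                # 3
print(secret.letters_match("abd"))  # 2 --- 'a' and 'b' match, 'd' does not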

An Individual

In Evolutionary Algorithms, an individual is simply a candidate solution. An individual has a genotype (the representation) and operators (Crossover, Mutation, and Fitness) that act on the genotype. We will discuss them more extensively below.

The Genotype

As aforementioned, a genotype is the representation of an individual. Just as DNA does for humans, the genotype gives you all the information needed to determine the characteristics of an individual.

Because a genotype must be acted upon by crossover and mutation operators, there are a few common choices for genotypes:

  • Vectors[1]. A vector is common because crossover is trivial: take elements from the two genotypes to create a new individual. Mutation is also trivial: pick random elements in the vector and mutate them. Often, for a complex enough individual, a vector of bits is used[2].
  • Matrices. Same as a vector, but with multiple dimensions.
  • Floating-Point or Real Numbers. This one is tricky, but commonly used. There are a plethora of ways to recombine two numbers: averaging the two numbers, bit manipulation, or crossover on a binary encoding. The same can be said for mutation: adding a random value to the number, bit manipulation with a random value, or bit flipping in a binary encoding (see the sketch after this list). It should be noted that some of these introduce biases, and one should account for them.
  • Trees. Some problems can be easily broken down into trees (like an entire programming language can be broken down into a parse tree). Crossover is trivial, swap a random subtree with another. Mutation, however, is often not used; this is because the crossover itself acts as a mutation operator.
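
For instance, a single real-valued gene might use the average of two parents as its crossover and a small Gaussian perturbation as its mutation. A minimal sketch (my own illustration; these operators are not used in the rest of this post):

from random import gauss


def average_crossover(gene_one, gene_two):
    """Recombine two real-valued genes by averaging them."""
    return (gene_one + gene_two) / 2


def gaussian_mutation(gene, sigma=0.1):
    """Mutate a real-valued gene by adding a small Gaussian perturbation."""
    return gene + gauss(0, sigma)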

Next, our genotype must be initialized to some random values. Our initial population is seeded with said randomly-generated individuals, and with a good distribution, they will cover a large portion of the genotype space.

Keeping all this in mind, let us think about the representation of our problem. A string is nothing more than a vector of characters, so using the first bullet point, we are given our operators pretty easily.

Here’s what our genotype would look like:

from random import choice
from string import ascii_letters


class Individual:
    message = SecretMessage("")

    def __init__(self):
        """Initialize an Individual object.

        Note:
            Individual.message should be initialized first.
        """
        length = Individual.message.length
        characters = [choice(ascii_letters) for _ in range(length)]
        self.genotype = "".join(characters)

Crossover, Mutation, and Fitness

As aforementioned, to fit within an Evolutionary Algorithm framework, a genotype must be created with crossover, mutator, and fitness operators. Although we have covered said operators, we will formalize them here.

  • Crossover. A crossover operator takes in two genotypes and produces a genotype that is a mixture of the two. The crossover can be uniform (random elements from both genotypes are taken), 1-point (pick a pivot position; the left half comes from one genotype and the right half from the other), or $N$-point (same as 1-point, but with multiple pivot positions); a short sketch of 1-point crossover follows this list.
  • Mutator. A mutator operator takes random values within the genotype and changes them to random values. There is a mutation rate associated with every genotype; we call it $p$. $p$ is bounded such that $0 \leq p \leq 1.0$, where $p$ is the fraction of the genotype that gets mutated. Be careful to limit this value, however; too high a $p$ results in little more than a random search.
  • Fitness. The fitness operator simply takes in a genotype, and outputs a numerical value proportional to how good a candidate solution is. Fitness has no limits, and can be inverted (i.e., a smaller fitness is better).
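
To make the variants concrete, here is a sketch of 1-point crossover on string genotypes (my own illustration; the implementation below uses the uniform variant instead):

from random import randrange


def one_point_crossover(genotype_one, genotype_two):
    """Combine two string genotypes by splitting both at a random pivot.

    Assumes both genotypes have the same length (at least 2).
    """
    pivot = randrange(1, len(genotype_one))
    return genotype_one[:pivot] + genotype_two[pivot:]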

For the purposes of our program, we are going to have the following operators: crossover will be uniform, mutation will be a fixed number of mutating characters, and fitness will be a percentage of the characters matched.

class Individual:
    ...
    @property
    def fitness(self):
        """Get the fitness of an individual. This is done via a percentage of how many characters
        in the genotype match the actual message.

        :return (float): The fitness of an individual.
        """
        return 100 * Individual.message.letters_match(self.genotype) / Individual.message.length

    @staticmethod
    def mutate(individual, rate):
        """Mutation operator --- mutate an individual with a specified rate.
        This is done via uniform random mutation: random genes are selected and replaced with random characters.

        Note:
            rate should be a floating point number (0.0 < rate < 1.0).

        :individual (Individual): The individual to mutate.
        :rate (float): The rate at which to mutate the individual's genotype.
        """
        number_of_characters_to_mutate = int(rate * individual.message.length)
        genotype_list = list(individual.genotype)  # Strings are immutable, we have to use a list

        for _ in range(number_of_characters_to_mutate):
            character_to_mutate = choice(range(individual.message.length))
            genotype_list[character_to_mutate] = choice(ascii_letters)

        individual.genotype = "".join(genotype_list)

    @staticmethod
    def recombine(parent_one, parent_two):
        """Recombination operator --- combine two individuals to generate an offspring.

        :parent_one (Individual): The first parent.
        :parent_two (Individual): The second parent.
        :returns (Individual): The combination of the two parents (the offspring).
        """
        new_genotype = ""

        for gene_one, gene_two in zip(parent_one.genotype, parent_two.genotype):
            gene = choice([gene_one, gene_two])
            new_genotype += gene

        individual = Individual()
        individual.genotype = new_genotype
        return individual

A Population

Now that we have an Individual, we must create a Population. The Population holds the candidate solutions, creates new offspring, and determines which are to propagate into further generations.

μ And λ

Remember when we mentioned that μ and λ would be important in our Evolutionary Algorithm? Well, here is where they come into play. μ and λ are defined as follows:

  • μ: The population size.
  • λ: The number of offspring to create.

Although these are simple constants, they can have a drastic impact on an Evolutionary Algorithm. For example, a Population of size 1,000 might find a solution in far fewer generations than a Population of size 100, but each generation will take longer to process. It has been experimentally shown that a good proportion between the two is:

$$
μ / λ \approx 6
$$

However, this is tested for a large class of problems, and a particular Evolutionary Algorithm could benefit from having different proportions.

For our purposes, we will pick $μ = 100$ and $λ = 15$, a proportion just a little over 6.

class Population:
    def __init__(self, μ, λ):
        """Initialize a population of individuals.

        :μ (int): The population size.
        :λ (int): The offspring size.
        """
        self.μ, self.λ = μ, λ

        self.individuals = [Individual() for _ in range(self.μ)]

Generating Offspring

Generating offspring is trivial with the framework we imposed on an Individual: pick two random parents, perform a crossover between the two to create a child, mutate said child, and introduce the child back into the population pool.

In code, it would look as follows:

class Population:
    ...

    @staticmethod
    def random_parents(population):
        """Get two random parents from a population.

        :return (Individual, Individual): Two random parents.

        """
        split = choice(range(1, len(population.individuals)))
        return choice(population.individuals[:split]), choice(population.individuals[split:])

    @staticmethod
    def generate_offspring(population):
        """Generate offspring from a Population by picking two random parents, recombining them,
        mutating the child, and adding it to the offspring. The number of offspring is determined by
        λ.

        :population (Population): The population to generate the offspring from.
        :returns (Population): The offspring (of size λ).
        """
        offspring = Population(population.μ, population.λ)
        offspring.individuals = []

        for _ in range(population.λ):
            parent_one, parent_two = Population.random_parents(population)

            child = Individual.recombine(parent_one, parent_two)
            Individual.mutate(child, 0.15)

            offspring.individuals += [child]

        return offspring

Survivor Selection

The last core part of an Evolutionary Algorithm is survivor selection. This puts selective pressure on our candidate solutions, and is what ultimately leads to fitter solutions.

Survivor selection picks μ Individuals that would be the best to propagate into the next generation; however, it’s not as easy as picking the fittest μ Individuals. Always picking the μ best Individuals leads to premature convergence, a way of saying we “got a good solution, but not the best solution”. The Evolutionary Algorithm simply did not explore the search space enough to find other, fitter solutions.

There are a number of ways to run survivor selection, one of the most popular being $k$-tournament selection. $k$-tournament selection picks $k$ random Individuals from the pool, and selects the fittest Individual from the tournament. It does this μ times to get the full, new Population. The higher $k$ is, the higher the selective pressure, but also the higher the chance of premature convergence. The lower $k$ is, the less the chance of premature convergence, but the more the Evolutionary Algorithm drifts toward a random search.

At the bounds, $k = 1$ is just a random search, and $k$ equal to the size of the pool always chooses the best μ individuals.

We choose $k = 25$, giving less fit solutions a chance to win, but still focusing on the more fit solutions.

from copy import deepcopy
from random import sample


class Population:
    ...

    @staticmethod
    def survival_selection(population):
        """Determine from the population which individuals should not be killed. This is done via
        k-tournament selection: generate a tournament of k random individuals, pick the fittest
        individual, add it to the survivors, and remove it from the original population.

        Note:
            The population should be of size μ + λ. The resultant population will be of size μ.

        :population (Population): The population to run survival selection on. Must be of size μ + λ.
        :returns (Population): The resultant population after killing off unfit individual. Will be
        of size μ.
        """
        new_population = Population(population.μ, population.λ)
        new_population.individuals = []

        individuals = deepcopy(population.individuals)

        for _ in range(population.μ):
            tournament = sample(individuals, min(25, len(individuals)))  # k = 25, capped so the shrinking pool never runs short
            victor = max(tournament, key=lambda individual: individual.fitness)

            new_population.individuals += [victor]
            individuals.remove(victor)

        return new_population

The Evolutionary Algorithm

Our Evolutionary Algorithm exactly resembles the pseudocode from the introduction. Because the Individual and Population framework is established, it is almost a copy-paste.

class EA:
    def __init__(self, μ, λ):
        """Initialize an EA.

        :μ (int): The population size.
        :λ (int): The offspring size.
        """
        self.μ, self.λ = μ, λ
        
    def search(self):
        """Run the genetic algorithm until the fitness reaches 100%.

        :returns (Individual): The fittest individual.
        """
        generation = 1
        self.population = Population(self.μ, self.λ)

        while self.population.fittest.fitness < 100.0:
            offspring = Population.generate_offspring(self.population)
            self.population.individuals += offspring.individuals
            self.population = Population.survival_selection(self.population)

            generation += 1

        return self.population.fittest
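
Note that the loop above relies on a fittest helper on Population that the snippets haven't defined; a minimal sketch, following the same conventions, could be:

class Population:
    ...

    @property
    def fittest(self):
        """Get the fittest individual in the population.

        :returns (Individual): The individual with the highest fitness.
        """
        return max(self.individuals, key=lambda individual: individual.fitness)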

Running An Evolutionary Algorithm

Below, we have an instance of the evolutionary algorithm searching for a solution:
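
A minimal driver, assuming the classes above live in a single module, might look like this (the hidden string matches the example discussed below):

if __name__ == "__main__":
    # Hypothetical wiring --- the hidden message mirrors the example below.
    Individual.message = SecretMessage("FreneticArray")

    ea = EA(100, 15)         # μ = 100, λ = 15
    fittest = ea.search()

    print(fittest.genotype)  # Prints "FreneticArray" once the search converges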

Now, looking at the string “FreneticArray”: it has 13 characters, and seeing as there are 26 letters in the alphabet (doubled for lowercase/uppercase), our search space was:

$$
\left(2 \times 26\right)^{13} \approx 2.0 \times 10^{22}
$$

Huge.

On average, our EA took 29 generations to finish[3]. As each generation had at most 115 individuals, we can conclude that on average we had to generate:

$$
29 \times 115 = 3335 \ \text{solutions}
$$

Much smaller than $2.0 \times 10^{22}$. That is what Evolutionary Algorithms are good for: turning a large search space into a much smaller one.

Although there are much more advanced topics in Evolutionary Algorithms, this is enough to start implementing your own. With just the simple operators listed above, a genotype, and a search space that has a gradient, many problems can be solved with an Evolutionary Algorithm.

The Source Code

All source code can be found here.


  1. A mathematical vector, common to linear algebra. Just a collection of related items, often referred to as an array in computer science. ↩︎

  2. Hey, if it’s powerful enough to run modern computers, surely it can be adequate enough for a genotype representation. ↩︎

  3. Per 1,000 runs. ↩︎

Computer Science Advice From A Computer Science Graduate

Reminiscing on my academic career, there are certainly things I did right; however, there are certainly things I did wrong. Aside from the rights and the wrongs, there are things I wish I had done sooner. Today, I hope to shed some light on some of these things.

For context, as of a month ago I am an alumnus of Missouri University of Science and Technology, an engineering college in Missouri. It has equipped me very well in mathematics[1], physics[2], and computer science[3]. I wrote roughly 159,141 lines of code in college, so I feel as if I got my money’s worth. Thanks to Missouri S&T, I have a fairly diverse background in many things.

Below is advice I would recommend to anyone currently studying computer science (or programming in general).

Get A Computer You’ll Love

I got a 13-inch MacBook Pro my freshman year of college, and it was the best decision I made in my academic career. At the beginning of my senior year, I got a 15-inch MacBook Pro with an LG UltraFine 5K Display; it was the second-best decision I made in my academic career.

Getting a great computer is not just about speeding up compile times; it’s about making programming more enjoyable. During a typical programming day, I’ll be on my computer 8-12 hours, which is 33%-50% of my time that day. Reducing friction for 33%-50% of the day is almost invaluable.

Although a MacBook was perfect for me, it might not necessarily be right for anyone else. Find a computer you love, you won’t regret it.

Learn LaTeX

LaTeX was probably my most used programming tool in my academic career. It was a Swiss Army knife for any form of documents I might need:

  • Notes
  • Presentations
  • Assignments
  • Papers
  • Documentation

Some samples of my LaTeX work can be found under my project page.

Learn A Different Language

At Missouri S&T, C++ is the first (and, during my time there, the only) language one learned. The problem with this? One language, even a do-it-all language like C++, is not always appropriate for the job. There were times where Python was a much better choice. Sometimes it was C. Sometimes it was Swift. It all depends on the project. My recommendation is if you learned Python, learn C++. If you learned C++, learn Python. From there, find languages that will be useful for your domain.

Learn Your Text Editor

I didn’t pick up Vim until I was a junior; this was a giant mistake on my part. Vim didn’t just make it easier to delete a particular line of code; Vim made it easier to open files quickly, explore codebases faster, modify text quicker, etc.

Vim is not for everyone. Vim might not be right for you. Find a good text editor, and learn everything you can about it. It’ll greatly reduce the friction of actually writing code.

Take Higher-Level Classes Early

Nearly all Computer Science programs require one to take higher-level classes as electives; at my school, it was roughly five graduate-level classes. I made the mistake of taking all five my last year.

Emphasis on mistake.

Take them early; chances are you won’t yet be bogged down with job hunting, apartment hunting, or the like. Taking 18 credit hours, with three very difficult classes, in one term is not something I would wish on anyone.

Find Interests In Computer Science

When you find topics that interest you in Computer Science, chances are you will like your major more. I found a particular interest in AI-related courses, but that is not for everyone. Computer Science has a breadth of interesting topics; check to see what your school has to offer, like:

  • Security
  • AI
  • Data Mining and Big Data
  • Machine Learning
  • Databases
  • UI/UX Design

Find Interests Outside Computer Science

Burnout is a serious issue in the industry. It can be a serious issue inside school as well. Foster passions outside of the classroom. Maybe join a design team, fraternity, or a sports team. There will be plenty of time at work to learn new things.

In addition, take genuinely interesting electives (not the easy ones). I learned I had an interest in Physics, and I carry that with me every day.

Intern As Much As Possible

Internships are not only a gentle introduction to the industry, but also a great way to determine what job you will want after school. Internships are often jokingly called “a 12-week interview process”, and this can be true. If you find a good company that you can do good work for, it’s a great way to secure a job upon graduation.

Have Fun

The most important lesson I learned in college was to have fun (in moderation). College is all about learning who you are, what you want out of life, how to interact with others, and how to simply have fun. Lessons you learn in college will get carried over after graduating. Why not use this time to make life-long friends, create great memories, and know who you really are?

In Closing

This guide is not comprehensive, but it’s something I wish I had read when I was starting. There’s always room to grow, so figure out what works for you. Do what makes you happy.

If there’s anything I am missing, feel free to ask.


  1. Calculus Series, Differential Equations, Linear Algebra, and Statistics. ↩︎

  2. Classical Mechanics, Electromagnetism, Modern Physics. ↩︎

  3. Artificial Intelligence, Evolutionary Computing, Algorithm Design, Object Oriented Numerical Modeling. ↩︎