It was National Coding Week in the UK last week, so what better time to reflect on the importance of programming in my life, and also to share my views on the latest hot topic - AI art.
My First Code
My first introduction to the idea of code was learning Basic on my Amiga 500 (what a classic machine), where my sister and I would get books from the library and, over many hours, code a game from the book. It would require some debugging to work on the Amiga, and the eventual output was not graphically engaging, but it was a game.
Following on, in high school IT classes, we had some more coding lessons, but these typically involved designing a series of steps for a turtle (used for CAD) to draw some shape on the screen. But as a maths and science nerd (and one who saw coding as cool - thanks, Hackers), I was still enamoured with the idea of designing programs on computers.
It would not be until the last two years of my degree that I would get into programming properly. What drew me in were the simulation exercises we performed in the earlier years of the degree, simulating the motion of atoms and molecules. The idea that we could simulate these interactions, and design programs to perform them ever more accurately, was exciting, especially for someone who was not the best experimentalist in the lab.
Fortran
Fortran would be my first true programming language, as the final project of the 4th year of my master's degree involved modifying and improving an existing piece of simulation software, all written in Fortran. I was not designing from scratch, but I still needed to understand the program well enough to expand it and correct issues within it. The program enabled new water simulations, using advanced representations of electrostatics derived from quantum mechanics. Programming, for me, was the union of maths, physics, and chemistry.
My degree led to a PhD with the same supervisor, which expanded on my degree project but introduced the new element of machine learning. Now, rather than just programming simulations, I was designing neural networks, and designing the data sets that were used to train them.
Fortran would remain a mainstay of my programming for years to come, expanding to include parallel programming and other languages such as SVL. However, it was the jump from postdoc to Research Software Engineer that prompted me to learn even more programming languages and discover my love of Python.
Python
Python didn't take too long to learn, which is perhaps the best thing about it - it is easy to pick up and teach. Initially, I was learning Python to write short scripts to manage data and tasks on high-performance computers, but before long, I was designing more elaborate programs, and would become the design lead on a project, leading postdocs and postgrads to create a Python-driven web app - an electronic lab notebook. This required a further expansion of skills to become a full-stack developer - using databases, Python, and HTML.
More recently, since joining Leighton and putting into practice more Python alongside my experience as a team leader and scientist, my joy of Python has increased further, as I toy around with frameworks that allow me to focus on the task rather than the means of performing it.
Machine Learning
A core aspect of my work has been machine learning, whether that has been the use of neural networks, Gaussian process regression, genetic algorithms, or similar. I've used it to simulate water and proteins, search for optimal crystal structures, and predict the response of materials to heat. Machine learning - not AI - is all about feeding data to some algorithm, and allowing that algorithm to extrapolate and interpolate, and so create solutions based upon the source data. Such machine learning is a hot topic now that "AI art" is all the rage - or inducing rage.
Let's stop for a moment and think about those words: extrapolate and interpolate. Machine learning can only use the data it is provided - especially if it is labelled appropriately - so that new data points can be generated "between" those provided. It is this predictive ability of machine learning that is now being leveraged for art.
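To make that concrete, here is a minimal sketch (not taken from any of my project code) using scikit-learn's Gaussian process regression, one of the techniques mentioned above. The model is trained on samples of a simple function over a limited range: a query inside that range (interpolation) comes back with small uncertainty, while a query far outside it (extrapolation) comes back with a much larger one.

```python
# Minimal sketch: interpolation vs extrapolation with Gaussian process regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Training data: noisy samples of a known function on a limited interval [0, 5].
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 5.0, 20).reshape(-1, 1)
y_train = np.sin(x_train).ravel() + rng.normal(scale=0.05, size=20)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.05**2)
gp.fit(x_train, y_train)

# Interpolation: a point inside the training range is predicted confidently.
# Extrapolation: a point well outside it carries a much larger uncertainty.
for x_query in [2.5, 8.0]:
    mean, std = gp.predict([[x_query]], return_std=True)
    print(f"x = {x_query}: prediction = {mean[0]:.3f} +/- {std[0]:.3f}")
```

The point of the sketch is simply that the model's confidence collapses once you ask it for something far from the data it was fed - it can only recombine what it has seen.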
As a tool, this provides artists with the ability to fill entire worlds with art. A perfect example would be an open-world computer game. Artists for the game can provide a library of base assets to train the algorithm, and then, when required and given parameters, the machine learning program can generate new assets based upon the inputs provided and the reference training library.
Where things become ethically questionable is when these machine learning algorithms are trained on art that is not open source or directly provided by its owners. The goodwill of artists labelling their work to make it easily searchable has enabled the scraping of the internet for art to feed to these new machine learning algorithms. Just as we have data protection acts to protect our personal and health data, which could all be used by machine learning to affect our credit scores or health insurance, there needs to be similar protection for the work of artists. Just because their art is available to view on the internet, it does not mean they have given permission for their art - and their ability and creativity - to be commodified on an industrial scale.
Proper compensation must be provided to the artists whose art is being used to train these algorithms, just as scientists must provide references and citations, and pay for licensed software, to perform their novel research. It is certainly not an appropriate response to tell the aggrieved artists to "go get a new job". And we should always remember that a machine learning algorithm is only as good as the data provided. The bias in what these algorithms render is something humans are better able to overcome, and currently, unlike an artist, AI is incapable of making deliberate, incremental changes to a piece in the way an artist refines a design.