Ask a scientist: How will AI affect creativity?

June 06, 2023

Where do you see the development of creative AI models heading in terms of the creative industries?

Once policymakers provide the needed – and demanded – regulation on these matters, the latest generation of creative AI will likely become yet another tool in many professionals' creative work. Especially in applied art, these systems will likely yield considerable cost reductions and an increase in productivity. I expect them to be used beyond the early phases of the creative process all the way to the final product. I also project that future generations of these systems will require even less involvement by artists, which is presently still very much needed.

However, the jury's still out on whether this development can be considered a benefit for everyone: our latest study found that professionals assumed various kinds of roles for themselves when working with AI, from "art director for the AI" to "slave to the AI." Moreover, while substituting for skills and workforce that might otherwise be unavailable, these systems might also increase our dependency on technology and on those who provide it – a development that we should be very conscious of.

The take-home message is that the impact of creative AI on professionals is not only positive; the situation is rapidly changing, and the diverse reactions prohibit a one-size-fits-all solution for now. This puts industry leaders, researchers and policymakers in a tricky position. Also, as teachers at Aalto, we must watch these developments closely to equip our students with skills that complement their traditional craft in a future-proof way.

How can we make the adoption of generative AI socially and ethically sustainable?

I consider sustainability one of the prime challenges for all of us in balancing the wellbeing of those affected by creative AI with business interests and scientific curiosity. More specifically, at this point we see two pressing questions that put many professionals in inner conflict. First, are artists going to be credited and compensated for the data used in training these models, and if so, how? Second, a major issue for professionals is who owns the copyright of the outputs. I argue that these issues must be resolved first, through quick and transparent legislation, to support the ethical and sustainable use of these systems.

In addition to these questions, we are left with a whole range of issues that are still in flux. For instance, what do professionals find most meaningful about their work, and consequently, which aspects should AI rather not touch? To this end, professional creatives must be involved in the regulation and development of creative AI. Discussions on social media and in the news can be very noisy and too superficial to inform, for example, policymaking. Through scientific studies, we can give professionals a clearer voice. Doing this in a longitudinal fashion should allow us to track how uses and perceptions change, and to adapt appropriately. Complementing such user studies, we must also become capable of experimenting with changes to the systems themselves, rather than simply taking what industry has to offer. We are now at a point where these types of models have become flexible enough to be trained and investigated at Aalto, an opportunity which my colleagues and I are now actively pursuing.

How should we define creativity and how does machine creativity differ from human creativity?

We can conceive of creativity as the production of novel as well as valuable artefacts – valuable, for instance, in terms of usefulness or aesthetic pleasure. But this is only one way to see it, and cognitive scientists still struggle to define creativity. In fact, the concept's meaning is constantly re-negotiated by society. Most notably, we have observed a shift in emphasis from the craft within the creative process to the ideas that go into it. This volatility makes research on creative AI a challenging endeavour and requires us to look beyond AI and human-computer interaction into cognitive science, philosophy, the social sciences, and other disciplines.

One way to differentiate human creativity from machine creativity is to think of it in terms of motivation. Much of human creativity is driven by intrinsic motivation such as curiosity: we act not for any value outside of the activity itself. This is fundamentally different from most creative AI, which is built to optimise a separate goal, such as producing outputs that people find most appealing by reproducing features of the data the system was trained on. However, I believe that this not only limits an AI's creative potential, but also the extent to which it could really complement and augment, rather than just substitute, human creativity. My research challenges this divide.

I believe that studying the functional and perceived disparities between human and AI is crucial in that it enables us to ask: how should artificial creativity be different from human creativity? And what biases are at work when we interact with creative AI, that keep us from using it in a more fulfilling way? We’re now at a juncture where, instead of asking "can AI be creative", we should be asking "what kind of creative AI is best for us".


You can follow Christian's work on Aalto's webpage, Mastodon and Twitter.

The source of this news is Aalto University.
