
We cannot know where that will end up because culture is such a feedback loop. I have been toying with a thought experiment that uses cultural interaction between AIs to negotiate aesthetic tastes and then uses that to create original works. If the idea is sound then I'll have a go at building it.

We might be closer to that than you think. Neural nets and GANs can already create content and artwork. If you trained a set of bots to create art, managed to teach them what influence means, and then set them off constantly riffing off each other, they would likely develop their own artistic styles. That's part of culture.
Of course, we're probably still a ways off from AI as a whole having its own culture encompassing more than one domain, but I think it's likely to happen in multiple ways in the future.
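To make the riffing idea concrete, here's a toy sketch of two bots nudging each other's "styles" through a feedback loop. Everything in it is made up for illustration (ArtBot, absorb, the influence rate), and it's nowhere near a real GAN pipeline:

```python
# Toy sketch of generative agents riffing off each other.
# All names and mechanics are hypothetical, not a real GAN setup.
import random

class ArtBot:
    def __init__(self, name, style):
        self.name = name
        self.style = style  # a crude "style vector" of preferences

    def create(self):
        # A "work" is just the current style plus some noise.
        return [s + random.gauss(0, 0.1) for s in self.style]

    def absorb(self, work, influence=0.2):
        # Being influenced: drift the style toward the other bot's work.
        self.style = [(1 - influence) * s + influence * w
                      for s, w in zip(self.style, work)]

bots = [ArtBot("a", [random.random() for _ in range(4)]),
        ArtBot("b", [random.random() for _ in range(4)])]

for step in range(100):
    works = [bot.create() for bot in bots]
    # Each bot sees the other's work and lets it shift its own style.
    bots[0].absorb(works[1])
    bots[1].absorb(works[0])

print([round(s, 3) for s in bots[0].style])
print([round(s, 3) for s in bots[1].style])
```

With purely symmetric influence like this, the two styles just converge on each other; getting genuinely distinct styles to emerge would need something pushing them apart as well (novelty rewards, audiences, and so on).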

I'm toying with making something like this part of my research program in the next year or two. Though the secret sauce is how we help the AIs understand when they are having an aesthetic experience. Otherwise, we're basically just putting randomness into feedback loops to see what sticks. Whether or not you think that randomness gets you anywhere interesting depends, I suppose, on how much of a Platonist you are.
I did write an interactive fiction where some AIs decided it was their religious duty to search for true randomness to see whether or not it existed.

That sounds cool! Is it up on the web somewhere?

No. It's an embryonic thought experiment for now. I'm going to run it past some AI researchers before I try to implement it.

No, I meant the interactive fiction.
The AI idea is cool as well, though. I've been mentally wrestling with how one might simulate emotional responses over the last few days myself. Not for the usual fictional reasons, but to help with learning and creation. Science fiction often has some android struggling with emotion, but I wonder if it might be far more important than we realize, and may need to be solved sometime soon. Not so we can have a computer cry over puppies, but so it can make subjective decisions.

If you're interested, the interactive fiction The Entropy Cage is available online and for Android.
There's a theory that emotion is the quick thinking and conscious thought is the slow thinking (the dual-process, "thinking fast and slow" idea). I'm not sure how I feel about that. The trick for building such AIs is coming up with a model that can actually be converted into code.
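For what it's worth, the fast/slow split is at least easy to turn into a skeleton. A minimal sketch, with a completely made-up decision loop (fast_appraisal, slow_deliberation, and the urgency threshold are all invented for illustration):

```python
# Hypothetical sketch of "emotion = fast appraisal, reason = slow deliberation".
import time

def fast_appraisal(option):
    # Cheap, learned gut reaction: a cached score, available instantly.
    gut_scores = {"help": 0.9, "ignore": 0.2, "harm": -1.0}
    return gut_scores.get(option, 0.0)

def slow_deliberation(option):
    # Expensive reasoning: stand-in for simulating outcomes, weighing evidence.
    time.sleep(0.01)  # placeholder for real computational cost
    consequences = {"help": 0.7, "ignore": 0.1, "harm": -0.9}
    return consequences.get(option, 0.0)

def decide(options, urgency):
    if urgency > 0.8:
        # No time to think: act on the gut reaction alone.
        return max(options, key=fast_appraisal)
    # Otherwise let the fast scores prune, then deliberate over what's left.
    shortlist = [o for o in options if fast_appraisal(o) > -0.5]
    return max(shortlist, key=slow_deliberation)

print(decide(["help", "ignore", "harm"], urgency=0.9))  # gut reaction only
print(decide(["help", "ignore", "harm"], urgency=0.3))  # pruned, then deliberated
```

The interesting part, of course, is where the gut scores come from in the first place; this sketch just hard-codes them.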

Perhaps it's simpler than I'm thinking. Perhaps it's a side effect of the complex reward system of the human brain, and something similar will emerge in any sufficiently advanced general AI.