2022.106 - Lounge (Scope: R2)
# [SERVER: 2022.106 16:31] fofo@HQ: connected to #lounge:r2
ado@OD: ahh, it gets my mind racing just thinking about it
ado@OD: say, this actually sounds like it could be a cool project
ado@OD: swing by optics and design some time if you want to do some brainstorming uwu
easter@MR: :o
# [SERVER: 2022.106 16:33] ado@OD: disconnected from #lounge:r2
akira@BA: gosh
akira@BA: is anyone else tired of machine learning projects in the vein of gpt and stable diffusion
nils@MR: the technology, or the energy around it?
akira@BA: the tech. it feels like a dead end to me
akira@BA: it takes so much power to get to a state that's remotely useful, and even then, it sucks
nils@MR: training takes more effort than you'd think
nils@MR: curating results, getting good sample data, probing for its limits...and losing hours of work when you leave it on overnight and the tunguska event happens to your model
akira@BA: yeah, that, precisely
akira@BA: because the machine wouldn't understand what it's doing. at best, you might make it fear god but--
easter@MR: imo the bigger problem is how profit incentives have affected the technology itself
easter@MR: after dall-e mini, i feel as though there's been a gradual fizzle out into rudely biased "beauty" filters, and even the big image "generators" are founded on training data they didn't get permission to use
easter@MR: neither of which has had much real world application. as is true of many technologies in their infancy, sure, but everything the status quo would want out of ai - replacing stock image libraries, displacing customer support workers - is... you know, very useless to anyone else?
akira@BA: isn't that why we have goat room?
easter@MR: goat room isn't omnipotent! you of all people should know that. you were there last week working on the satellite!
easter@MR: we can protest with it, make some big waves even - remember the elon scandal - but it's difficult to change public opinion
easter@MR: especially the consensus held by the kind of people who care for machine learning.
akira@BA: sorry,
easter@MR: it's fine!
akira@BA: i still believe there is one field machine learning hasn't really touched that would actually be worth a damn for society
akira@BA: ml models cut such a big corner in the way their output is actually made
akira@BA: photorealistic illustrations aren't considered for how they would occupy a 3d space. drawings and paintings aren't done with layers and brush strokes. text isn't considered for its semantic or practical value
# [SERVER: 2022.106 17:00] floyd@YM: connected to #lounge:r2
akira@BA: it just takes noise and applies weights the humans told it were good to put ~something~ there
akira@BA: but what if it did consider these things
nils@MR: yeah, that's endgame - a real "artificial intelligence"
nils@MR: the problem is that the amount of resources that would take is entirely out of reach, even for the likes of beige alert and macrodata refinement
akira@BA: what stops us from playing god
akira@BA: what stops us from creating a being equal parts mechanical and biological
akira@BA: and giving them awareness? and the ability to comprehend
akira@BA: like, have you seen how much the human brain can do on less power than it takes to light a bulb
nils@MR: the law, parental responsibility. i guess all things do come to an end, so to give them a body prone to eventual failure isn't like horrifically rude but
fofo@HQ: Also. The watchful eyes of Joseph
nils@MR: what
akira@BA: huh?? who's that?
fofo@HQ: hahahahah
fofo@HQ: kidding, sorry
fofo@HQ: i'll see myself out. good night everyone
floyd@YM: Soooo
floyd@YM: Anyone down to go out for like, sushi or something tomorrow?
akira@BA: oh hell yeah
easter@MR: haven't had it in a while, so sure