(Learning0) model git:(main) ls process



researching
designing
redesigning
scraping

downloading
installing
uninstalling
resizing
sorting
training
fine-tuning
analyzing
retraining
comparing
more retraining
optimizing
testing
validating
upgrading
flashing
initializing
connecting
transferring
more testing
modelling
printing
wiring
troubleshooting
debugging
converting
assembling
testing
more designing
decorating
deploying

 






CASSI (2025)

3D printed PVA, synthetic hair, cardboard, scar tape, Jetson Nano with webcam, Arduino, Adafruit NeoPixel ring

Data from thispersondoesnotexist.com and Kaustubh Dhote’s face dataset

Written in Python and JavaScript, running with TensorFlow, Keras, Pillow, PyTorch, and PySerial

GITHUB



CASSI is a Companion for Anti-Social Sexual Individuals

Our internet activities constantly jump across the line between public and private. We are obsessed with privacy, yet willing to sell all our secrets, crimes, hopes and desires for free WiFi.

Love and sex in the last few decades have become hopelessly intertwined with our online identities, so much so that these online identities are no longer online: they are us. Love no longer has any separation from the performative, enhanced and sensational nature of internet media.

Although the conversation surrounding AI, tech and humans feels very tired, AI is entering our lives subconsciously, in undetectable ways: pornography, dating apps, algorithms, sex robots, Facetune, even profile pictures and movies. Our associations with a human image, especially a woman's image, on a screen have become inherently sexual. Tinder and softly sexualized, racialized media have created an internet where the face of a woman simply existing on a screen, especially in our private environments and on our personal screens, is subject to sexual consumption. Digital sexual consumption has not even hesitated to extend to fake people, animations, animals and inanimate objects.

Pornography, and sexual attraction as a whole, are in our modern day undeniably tied to race and gender. Tech as a whole is as well. How would you react if these biases, desires and judgements were brought into the real world for all to see? If images of your friends, grandparents and favourite celebrities were subject to the judgement that we all subconsciously project onto screen faces all day, every day? Without your discretion, context and hesitation?

CASSI is a robot. She is a beautiful woman, made of PVA, circuits, synthetic hair and wires.

CASSI runs on a neural network, a fine-tuned version of the ResNet50 CNN, trained on 2000-4000 images of synthetic faces as well as celebrity face datasets, organized into the painfully strict categories of "yes" and "no": the question of attractiveness.
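A minimal sketch of what fine-tuning ResNet50 for a binary "yes"/"no" classifier looks like in Keras. This is an illustration under assumptions, not CASSI's actual training script; the hyperparameters and the frozen-backbone setup are my own choices.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50


def build_model(weights="imagenet"):
    """Build a binary attractiveness classifier on top of ResNet50."""
    # Load the ResNet50 backbone without its 1000-class ImageNet head;
    # global average pooling collapses the feature maps to one vector
    base = ResNet50(weights=weights, include_top=False, pooling="avg",
                    input_shape=(224, 224, 3))
    base.trainable = False  # freeze the backbone for the first training pass

    # A single sigmoid unit answers the only question: yes (1) or no (0)
    model = models.Sequential([base, layers.Dense(1, activation="sigmoid")])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

In practice the backbone could later be unfrozen and trained at a lower learning rate, which is the usual second step of fine-tuning.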

The model identifies faces I would deem attractive with 80-90% accuracy. Through a built-in camera, OpenCV detects faces in the live video feed and sends them to the model for judgement. Serial data is then sent to an Arduino, attached to a circuit that lights the robot's face to indicate yes or no.
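The live pipeline described above can be sketched roughly as follows: webcam frame, OpenCV face detection, model verdict, one serial byte to the Arduino. The model filename, serial port, and the b"1"/b"0" firmware protocol are all assumptions for illustration, not CASSI's actual code.

```python
def verdict(score, threshold=0.5):
    """Map the model's sigmoid output to the serial byte for the Arduino.
    The b"1"/b"0" protocol is an assumption about the firmware."""
    return b"1" if score > threshold else b"0"


def run(model_path="cassi_model.h5", port="/dev/ttyUSB0"):
    # Heavy imports kept local so the verdict logic stands on its own
    import cv2
    import numpy as np
    import serial
    import tensorflow as tf

    model = tf.keras.models.load_model(model_path)   # assumed filename
    arduino = serial.Serial(port, 9600)              # assumed port / baud
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)                        # built-in webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
            # Crop and resize each face to ResNet50's expected input
            face = cv2.resize(frame[y:y + h, x:x + w], (224, 224))
            score = float(model.predict(face[np.newaxis] / 255.0,
                                        verbose=0)[0][0])
            arduino.write(verdict(score))

# run() requires a camera, a trained model file and a connected Arduino
```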

The robot puts all of my own biases on blast. Training an ML algorithm is like dealing with an avoidant partner: it won't ever tell you exactly what is wrong, but it will answer yes or no to a million specific questions and hope that you figure out the issue yourself. The algorithm forces me to deal with any personal dysmorphia I may have. It knows which side of my face is my good side, and it will not lie to make me feel better.

Where is the line at which we stop projecting empathy, personality and humanness onto computers and technology?

Jeong Geum-Hyung creates flailing and helpless robot parts that make us question this inclination to personify or empathise with robots. Jeong's works explore the relationships between human and technology, human and non-human, and the way capitalism aggravates and muddles these relationships. Her pieces reflect "a comfort with technology that is uncomfortable, a sensitivity towards objects that is non-consensual, a beautiful and horrifying revelation of techno-social opposition and similitude" (Achenbach, 2022).
When we sexualize robots, do we remove empathy? When we refrain from sexualizing them, do we begin to practice it? Or do the two exist at the same time?

Is the image of a person on a screen no longer human, subject to sexual projection? Is an AI image, a hentai, or a Facetuned person deserving of empathy, of personification? Are these people, images and things raced and gendered? Because we certainly project those ideals upon them regardless.


The lines are blurry and the robot doesn't have answers; she simply knows HOT or NOT!