https://twitter.com/monolesan
what about an animation based on your movements?

idea

interactive images on the web that react to a person's movements

What does it do?

a bad-quality 3D David's head repeats your head rotation and cries if you leave him alone

What technology is used there?

TensorFlow.js PoseNet model — detects body parts
p5.js — for working with the canvas (PoseNet requires a canvas) and drawing acid green tears
three.js — for working with 3D (I could have used just p5.js in this case, but I wanted to build a more complex project in the future)
ml5.js — for easy work with TensorFlow.js
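
A quick sketch of how these pieces can fit together (this is my own minimal setup based on the usual ml5 PoseNet + p5.js pattern, not the project's exact code; the variable names are mine):

let video;
let nose, leftEar, rightEar, leftEye, rightEye;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // webcam feed from p5.js
  video.hide();

  // ml5 wraps TensorFlow.js PoseNet: pass it the video and a "model is ready" callback
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet is ready'));

  poseNet.on('pose', (poses) => {
    if (poses.length === 0) return;
    // PoseNet keypoints look like { part, score, position: { x, y } };
    // flatten them so the snippets below can use nose.x, rightEye.score, etc.
    const keypoints = poses[0].pose.keypoints;
    const get = (part) => {
      const k = keypoints.find((kp) => kp.part === part);
      return { x: k.position.x, y: k.position.y, score: k.score };
    };
    nose = get('nose');
    leftEar = get('leftEar');
    rightEar = get('rightEar');
    leftEye = get('leftEye');
    rightEye = get('rightEye');
  });
}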

LllI𝖎̞͇́̌̚iNnnKkkSss

Design on Figma #OpenDesign ✧ •  .   * What is Open Design?????
Code on Glitch
Tears animation code on p5.js editor

✧* Let’s create something cool and innovative with AI! *✧

Detect head rotation

step 1

Calculate the distance between the right ear and the nose, and the distance between the left ear and the nose, using the Pythagorean theorem.
distance² = (ear.x - nose.x)² + (ear.y - nose.y)²

In code this formula is:

distanceLeftEarNose = Math.round(Math.sqrt(Math.pow(leftEar.x - nose.x, 2) + Math.pow(leftEar.y - nose.y, 2)));

distanceRightEarNose = Math.round(Math.sqrt(Math.pow(rightEar.x - nose.x, 2) + Math.pow(rightEar.y - nose.y, 2)));

The larger the distance between the left ear and the nose, the smaller the distance between the right ear and the nose, and vice versa. If a person turns their head to the right, the right distance decreases and the left distance increases. If they turn their head to the left, the right distance increases and the left distance decreases. The distances are almost equal if the person looks straight ahead.

step 2

Add up the two distances:

distanceBetweenTwoEars = distanceLeftEarNose + distanceRightEarNose;

And divide the sum by two:

distanceGeneralForEachEar = distanceBetweenTwoEars / 2;

step 3

The field of view is 180º. It starts from the left side, where it equals 0º (this is why the rotation percent is calculated from the distance between the left ear and the nose). The 3D model looks forward when the rotation degree equals 90º.

Calculate rotation percent:

rotationPercent = (distanceLeftEarNose * 100) / distanceGeneralForEachEar;
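
One plausible way to turn this percent into an actual rotation of the head (this is my own assumption, not necessarily the project's exact mapping; davidHead stands for the loaded three.js model):

// rotationPercent is about 100 when the person looks straight ahead,
// so scaling by 0.9 maps it onto the 0º–180º field of view (100 → 90º)
rotationDegrees = rotationPercent * 0.9;

// three.js works in radians; subtract 90º so that "looking forward" keeps the model unrotated
davidHead.rotation.y = THREE.MathUtils.degToRad(rotationDegrees - 90);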

how to detect if the human left David alone ((

That’s quite a simple task. For this, the scores of the right and left eyes are tracked and compared with min and max thresholds. The min is 0.2: below it, we are almost sure there is no face, so we can draw tears. The max is 0.5: above it, we are almost sure there is a face.
The score is the probability that the eyes are on the image. The maximum is 1.

if (rightEye.score < 0.2 && leftEye.score < 0.2) {
  isthereface = false;
}

if (rightEye.score > 0.5 && leftEye.score > 0.5) {
  isthereface = true;
}
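
The flag can then drive the tears in the p5.js draw loop, for example like this (drawTear() here is a hypothetical helper that stands in for the acid green tears animation):

function draw() {
  image(video, 0, 0, width, height); // draw the current webcam frame
  if (!isthereface) {
    drawTear(); // nobody is looking at David, so let him cry
  }
}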

Gates and s̎̚k̋ͭͤy̻̩ͦ̃̐ are my design

I was inspired by a random picture from the internet, BONES, and webpunk.

the picture:

BONES's video:

w̰͛eͧ̏ͫ͒b̩̩͆̔͊́p̺̓́̀ͦ̔̚u̹̜̳͗̆̃̓ͯ̚n̜̼̲͍̒͌̅̇̔̐̚k̥̤̙̬̮͎͔̊̍ͦͪ͑:

here are some drafts:

The art of the link͔͙̦̞͕̞̜̝

b̤͗̈́ùͤ̋g̜͑͊s͓̙ͭs̘̟ͧs͖̎ͨzzz

Like everything in this project, the link to my Twitter has bugs too. It works in quite an interesting way.

There are two functions:
1. The first replaces the link text with zalgo, strange symbols, Russian letters, and Japanese kanji on hover. It adds symbols from an array (containing these symbols) piece by piece every 50 milliseconds.
2. The second restores the original link text when the cursor is moved away.

How and why is it buggy?

There is no bug if the hover animation completes (all symbols from the array are shown). Otherwise, if the cursor is moved away before the first function finishes, the second function is called: it turns all the symbols written so far back into the link text, while the not-yet-shown symbols keep being added to the link text.

you can find the link animation code in public/ui.js
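
For illustration, a minimal version of the idea could look like this (the selector, the symbol set, and the timing details are my guesses; the real code in public/ui.js differs):

const link = document.querySelector('.twitter-link'); // hypothetical selector
const originalText = link.textContent;
const glitchSymbols = ['ж', '̴̨', '乱', 'ф', '鏡', 'ツ']; // zalgo, Russian letters, kanji...
let glitchTimer = null;

link.addEventListener('mouseenter', () => {
  let shown = '';
  let i = 0;
  glitchTimer = setInterval(() => {
    if (i >= glitchSymbols.length) {
      clearInterval(glitchTimer);
      return;
    }
    shown += glitchSymbols[i++]; // add one symbol every 50 milliseconds
    link.textContent = shown;
  }, 50);
});

link.addEventListener('mouseleave', () => {
  // in the buggy version the hover timer seems to keep running,
  // which is why leftover symbols are appended to the restored text
  clearInterval(glitchTimer);
  link.textContent = originalText;
});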

text

The text is written in a column, like in the Japanese writing system.

Initially, I wanted the link button to change color on hover. But then I found a group on VK with pictures and music. The look of that group inspired me to fill the link button with strange symbols in an animation.

d𝖉̲̼̬̲̏𝖉̱dddrafts

Logo and name

I tried some fonts for the name header of the project:

I chose Grind and Death font, then I played with colors...

...and with combinations of languages. I used katakana letters there. Katakana is sometimes used for the transcription of foreign-language words into Japanese. The word “mirror” will sound like “miroru” in this transcription.
A little tip: ミ is read like “mi”, ロ is read “ro”, ル is read “ru”.

Then I played with colors and added a yellow background:

Eventually, got this:

That's the end. Thanks for reading!

Check out my other experiments with AI on https://monolesan.com