Making Jellyfish Move In Compose: Animating ImageVectors And Applying AGSL RenderEffects 🐠

  • aster.cloud
  • November 27, 2022
  • 10 minute read

I love following inspiring people on the internet and seeing what they make — one such person is Cassie Codes, who makes incredible animations for the web. One of her inspiring examples is this cute animated Jellyfish.

Animated Jellyfish by Cassie Codes

After seeing this and obsessing over it for a while, I kept thinking to myself that this cute little creature needs to come to life in Compose too. So this blog post describes how I went about making this in Jetpack Compose; the final code can be found here. The techniques in here are not only relevant for jellyfish of course… any other fish will do too! Just kidding — this blog post will cover:



  • Custom ImageVectors
  • Animating ImageVector Paths or Groups
  • Applying a distortion noise effect on a Composable with AGSL RenderEffect.

Let’s dive in! 🤿

Analyzing the SVG

To implement this jellyfish, we need to see what the SVG is made up of first and try to replicate its different parts. The best way to figure out what an SVG is drawing is to comment out various parts of it and see what each section renders. To do this, you can either change it in the codepen linked above, or download the SVG and open it in a text editor (it's a human-readable text format).

So let’s take an overview look at this SVG:

<!-- 
    Jellyfish SVG, path data removed for brevity 
--> 
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 530.46 563.1">
  <defs>
  <filter id="turbulence" filterUnits="objectBoundingBox" x="0" y="0" width="100%" height="100%">
    <feTurbulence data-filterId="3" baseFrequency="0.02 0.03" result="turbulence" id="feturbulence" type="fractalNoise" numOctaves="1" seed="1"></feTurbulence>
    <feDisplacementMap id="displacement" xChannelSelector="R" yChannelSelector="G" in="SourceGraphic" in2="turbulence" scale="13" />
  </filter>    
  </defs>
  <g class="jellyfish" filter="url(#turbulence)">
    <path class="tentacle"/>
    <path class="tentacle"/>
    <path class="tentacle" />
    <path class="tentacle" />
    <path class="tentacle"/>
    <path class="tentacle"/>
    <path class="tentacle"/>
    <path class="tentacle"/>
    <path class="tentacle"/>
    <path class="face" />
    <path class="outerJelly"/>
    <path id="freckle" />
    <path id="freckle"/>
    <path id="freckle-4"/>
  </g>
  <g id="bubbles" fill="#fff">
    <path class="bubble"/>
    <path class="bubble"/>
    <path class="bubble" />
    <path class="bubble"/>
    <path class="bubble"/>
    <path class="bubble"/>
    <path class="bubble" />
  </g>
  <g class="jellyfish face">
    <path class="eye lefteye"  fill="#b4bebf" d=""/>
    <path class="eye righteye" fill="#b4bebf" d=""/>
    <path class="mouth" fill="#d3d3d3" opacity=".72"/>
  </g>
</svg>

The SVG consists of the following elements:

  1. Paths and Groups of paths that make up the SVG:
  • Tentacles
  • Face — blob and outer jelly
  • Eyes — they animate open and closed
  • Bubbles — animate randomly around the jellyfish; the size and alpha animate

2. Overall, the jellyfish body also has feTurbulence (noise) applied as a feDisplacementMap, which gives it the wobbly look.

Now that we understand what this SVG is made up of, let’s go about rendering the static version in Compose.

Creating a custom ImageVector

Compose has a concept of an ImageVector, where you can build up a vector programmatically — similar to SVG. For vectors/SVGs that you just want to render without changing, you can also load up a VectorDrawable using painterResource(R.drawable.vector_image). This will take care of converting it into an ImageVector that Compose will render.


Now you might be asking yourself — why not just import the jellyfish as an SVG into an XML file and load it up using painterResource(R.drawable.jelly_fish)?

That is a great question — and it is possible to load the jellyfish this way: remove the turbulence aspect of the SVG and the image will render when loaded from XML (as explained in the documentation here). But we want to do a bit more with the individual parts of the path, such as animating parts on click and applying a noise effect to the body, so we will build up our ImageVector programmatically.

In order to render this jellyfish in Compose, we can copy the path data (the “d” attribute on the path) that makes up the fish. For example, the first tentacle has the following path data:

M226.31 258.64c.77 8.68 2.71 16.48 1.55 25.15-.78 8.24-5 15.18-7.37 23-3.1 10.84-4.65 22.55 1.17 32.52 4.65 7.37 7.75 11.71 5.81 21.25-2.33 8.67-7.37 16.91-2.71 26 4.26 8.68 7.75 4.34 8.14-3 .39-12.14 0-24.28.77-36 .78-16.91-12-27.75-2.71-44.23 7-12.15 11.24-33 7.76-46.83z

If you are new to paths / vectors / SVGs, the above might be a bit overwhelming. But don’t worry, these are just commands that specify mathematical instructions on how to draw something. For instance, M is a command to move the virtual cursor to a new position without drawing, and L is a command to draw a line to the specified position. There are a few other commands, such as:

  • M, m: Move to
  • L, l, H, h, V, v: Line to
  • C, c, S, s: Cubic Bézier curve to
  • Q, q, T, t: Quadratic Bézier curve to
  • A, a: Elliptical arc curve to
  • Z, z: Close the path

The commands are case sensitive: an uppercase letter indicates absolute coordinates in the viewport space, whereas a lowercase letter indicates that the command is relative to the current position.
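
To make this concrete, here is a small, hypothetical example (not part of the jellyfish) of parsing a simple path string with Compose's PathParser, the same parser used for the tentacle path below:

import androidx.compose.ui.graphics.vector.PathParser

// "M10 10 L90 10 L50 80 Z" moves to (10,10), draws lines to (90,10) and (50,80),
// then closes the path back to the start: a simple triangle.
val triangleNodes = PathParser()
    .parsePathString("M10 10 L90 10 L50 80 Z")
    .toNodes()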

Now you are probably thinking — do I have to draw in my head and know all the positions and commands by hand? No — not at all. You can create a vector in most design programs, such as Figma or Inkscape, and export the result of your drawing to an SVG to get this information for yourself. Whew! 😅

To create the vector in Compose: we call rememberVectorPainter, which creates an ImageVector, and we create a Group called jellyfish, then another Group called tentacles, and we place the first Path inside it for the first tentacle. We also set a RadialGradient as the background for the whole jellyfish.

/* Copyright 2022 Google LLC.	
   SPDX-License-Identifier: Apache-2.0 */
object JellyFishPaths {
    val tentaclePath = PathParser().parsePathString(
        "M226.31 258.64c.77 8.68 2.71 16.48 1.55 25 .15-.78 8.24-5 15.18-7.37 23-3.1 10.84-4.65 22.55 1.17 32.52 4.65 7.37 7.75 11.71 5.81 21.25-2.33 8.67-7.37 16.91-2.71 26 4.26 8.68 7.75 4.34 8.14-3 .39-12.14 0-24.28.77-36 .78-16.91-12-27.75-2.71-44.23 7-12.15 11.24-33 7.76-46.83z"
    ).toNodes()
}

@Preview
@Composable
fun JellyfishAnimation() {
    val vectorPainter = rememberVectorPainter(
        defaultWidth = 530.46f.dp,
        defaultHeight = 563.1f.dp,
        viewportWidth = 530.46f,
        viewportHeight = 563.1f,
        autoMirror = true,
    ) { _, _ ->
        Group(
            name = "jellyfish"
        ) {
            Group("tentacles") {
                Path(
                    pathData = tentaclePath,
                    fill = SolidColor(Color.White),
                    fillAlpha = 0.49f
                )
            }
        }
    }

    Image(
        vectorPainter,
        contentDescription = "Jellyfish",
        modifier = Modifier
            .fillMaxSize()
            .background(largeRadialGradient)
    )
}

// Create a custom gradient background with a radius based on the biggest dimension of the
// drawing area; this creates a better-looking radial gradient in this case.
val largeRadialGradient = object : ShaderBrush() {
    override fun createShader(size: Size): Shader {
        val biggerDimension = maxOf(size.height, size.width)
        return RadialGradientShader(
            colors = listOf(Color(0xFF2be4dc), Color(0xFF243484)),
            center = size.center,
            radius = biggerDimension / 2f,
            colorStops = listOf(0f, 0.95f)
        )
    }
}

And the result of the above is a small tentacle drawn on screen with a radial gradient background!

First tentacle rendered

We repeat this process for all the elements of the SVG — taking the bits of the path from the SVG file and applying the color and alpha to the path that will be drawn. We also logically group the paths into the tentacles, the face, the bubbles, etc.:

/* Copyright 2022 Google LLC.	
   SPDX-License-Identifier: Apache-2.0 */
val vectorPainter = rememberVectorPainter(
        defaultWidth = 530.46f.dp,
        defaultHeight = 563.1f.dp,
        viewportHeight = 563.1f,
        viewportWidth = 530.46f,
        autoMirror = true,
    ) { _, _ ->
        Group(name = "jellyfish") {
            Group(name = "tentacles") {
                // tentacle paths shown above
            }

            Group(name = "body") {
                // inner and outer jelly paths
            }
            Group(name = "freckles") {
                // freckle paths

            }
            Group(name = "face") {
                // face paths
            }
        }
        Group(name = "bubbles") {
            // bubbles around the jellyfish
        }
    }

    Image(
        vectorPainter,
        contentDescription = "Jellyfish",
        modifier = Modifier
            .fillMaxSize()
            .background(largeRadialGradient)
    )

We now have our entire jellyfish rendering with the above ImageVector:

Whole static Jellyfish rendering in Compose

Animating ImageVector Paths and Groups

We want to animate parts of this vector:

  • The jellyfish should move up and down slowly
  • The eyes should blink on click of the jellyfish
  • The jellyfish body should have a wobbly/noise effect applied to it.

Doing this with an XML file is difficult: it’s tricky to animate, we need to work with XML, and we can’t apply other effects to parts inside the file without converting it to an AnimatedVectorDrawable. For example, we want to introduce an interaction where the eyes blink on click of the jellyfish. Building the vector programmatically gives us finer-grained control of our elements and lets us create custom animations.

So let’s see how we can animate individual bits of the ImageVector.

Moving the jellyfish up and down

Looking at the codepen, we can see that the jellyfish is moving with a translation up and down (a y translation). To do this in Compose, we create an infinite transition and a translationY that is animated over 3000 ms. We then set the group containing the jellyfish, and the face, to have a translationY; this produces the up-and-down animation.

/* Copyright 2022 Google LLC.	
   SPDX-License-Identifier: Apache-2.0 */
val vectorPainter = rememberVectorPainter(
        defaultWidth = 530.46f.dp,
        defaultHeight = 563.1f.dp,
        viewportHeight = 563.1f,
        viewportWidth = 530.46f,
        autoMirror = true,
    ) { viewPortWidth, viewPortHeight ->
        val duration = 3000
        val transition = rememberInfiniteTransition()
        val translationY by transition.animateFloat(
            initialValue = 0f,
            targetValue = -30f,
            animationSpec = infiniteRepeatable(
                tween(duration, easing = EaseInOut),
                repeatMode = RepeatMode.Reverse
            )
        )
        Group(name = "jellyfish", translationY = translationY) {
            Group(name = "tentacles") {
            }
            Group(name = "body") {
            }
            Group(name = "freckles") {
            }
        }
        Group(name = "bubbles") {
           
        }
        Group(name = "face", translationY = translationY) {
        }
    }
Translation Up and down

Great — part of the ImageVector is now animating up and down; you will notice that the bubbles remain in the same position.

Blinking Eyes 👁️

Looking at the codepen, we can see that there is a scaleY and opacity animation on each of the eyes. Let’s create these two variables and apply the scale to the Group and the alpha on the Path. We will also only apply these on click of the jellyfish, to make this a more interactive animation.

We create two Animatables which will hold the animation state, and a suspend function that we will call on click of the jellyfish — we animate these properties to scale and fade the eyes.
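
As a rough sketch (the names, durations, and target values here are illustrative assumptions, and imports are omitted as in the other snippets):

// Two Animatables hold the eye scale and alpha state (created inside the composable).
val eyeScale = remember { Animatable(1f) }
val eyeAlpha = remember { Animatable(1f) }
val scope = rememberCoroutineScope()

// Squash and fade the eyes closed, then animate them open again.
suspend fun blinkEyes() {
    coroutineScope {
        launch { eyeScale.animateTo(0.2f, animationSpec = tween(150)) }
        launch { eyeAlpha.animateTo(0.3f, animationSpec = tween(150)) }
    }
    coroutineScope {
        launch { eyeScale.animateTo(1f, animationSpec = tween(300)) }
        launch { eyeAlpha.animateTo(1f, animationSpec = tween(300)) }
    }
}

// eyeScale.value is read as scaleY on the eye Group, eyeAlpha.value as fillAlpha on the eye
// Paths, and the blink is triggered from the Image's modifier:
// Modifier.clickable { scope.launch { blinkEyes() } }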

We now have a cute blinking animation on click — and our jellyfish is almost complete!

Blinking on click of ImageVector

Applying a distortion/noise effect 📯

So we’ve got most of the things we want to have animated — the movement up and down, and the blinking. Let’s look at how the jellyfish’s body gets that wobbly effect: the body and tentacles move with noise applied to them to give a sense of movement.

Codepen: Jellyfish without noise vs with noise applied

Looking at the SVG and the animation code, we can see that it uses feTurbulence to generate noise that is then applied to the SVG as a feDisplacementMap.

 <filter id="turbulence" filterUnits="objectBoundingBox" x="0" y="0" width="100%" height="100%">
    <feTurbulence data-filterId="3" baseFrequency="0.02 0.03" result="turbulence" id="feturbulence" type="fractalNoise" numOctaves="1" seed="1"></feTurbulence>
    <feDisplacementMap id="displacement" xChannelSelector="R" yChannelSelector="G" in="SourceGraphic" in2="turbulence" scale="13" />
  </filter>    
  </defs>
  <g class="jellyfish" filter="url(#turbulence)">

Unfortunately these primitives are not supported in Android at the moment (see this open bug), but we do have other tools up our sleeve that can help with this. feTurbulence generates noise that is then used as a displacement map to move the SVG around.

We can use AGSL shaders to achieve this; it’s worth noting that this is only supported on Tiramisu and up (API 33+). First we need to create a shader that acts as a wobble. We won’t use noise at first, just a simple mapping function for simplicity.

The way the shaders work is that they act on individual pixels — we get a coordinate (fragCoord) and we are expected to produce a color result that’ll be rendered at that coordinate. Below is the initial shader we will use for transforming the composable:
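
A minimal sketch of such a shader, where the uniform names (resolution, time, contents) match how they are set from the Compose code below, and the sine-based offset is an illustrative choice rather than the exact shader:

val WOBBLE_SHADER = """
    uniform float2 resolution;
    uniform float time;
    uniform shader contents;

    half4 main(float2 fragCoord) {
        // Shift the sampling coordinate with a gentle sine wave that drifts over time,
        // then read the composable's pixel from that shifted position.
        float2 offset = 8.0 * float2(
            sin(fragCoord.y / 30.0 + time),
            sin(fragCoord.x / 30.0 + time)
        );
        return contents.eval(fragCoord + offset);
    }
""".trimIndent()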

In our case, the input that we will be using is our currently rendered pixels on screen. We get access to this via the uniform shader contents; declaration that we send as input. We take the input coordinate (fragCoord) and apply some transformations to it — moving it with time and generally performing some math on it to move it around.

This produces a new coordinate, so instead of returning the exact color at the fragCoord position, we shift where we get the input pixel from. For example, if we had return contents.eval(fragCoord), it would produce no change — it would be a pass-through. We now get the pixel color from a different point of the composable — which will create a wobbly distortion effect on the content of the composable.

To use this on our composable, we can apply this shader as a RenderEffect to the contents of the composable:

/* Copyright 2022 Google LLC.	
   SPDX-License-Identifier: Apache-2.0 */
val time by produceState(0f) {
   while (true) {
       withInfiniteAnimationFrameMillis {
           value = it / 1000f
       }
   }
}

val shader = RuntimeShader(WOBBLE_SHADER)

Image(
   vectorPainter, contentDescription = "",
   modifier = Modifier
       .fillMaxSize()
       .background(largeRadialGradient)
       .onSizeChanged { size ->
           shader.setFloatUniform(
               "resolution",
               size.width.toFloat(),
               size.height.toFloat()
           )
       }
       .graphicsLayer {
           shader.setFloatUniform("time", time)
           renderEffect = android.graphics.RenderEffect
               .createRuntimeShaderEffect(
                   shader,
                   "contents"
               )
               .asComposeRenderEffect()
       }
)

We use createRuntimeShaderEffect, passing in the WOBBLE_SHADER as input. This takes the current contents of the composable and provides it as input to the shader, with the parameter name “contents”. We then query the contents inside the WOBBLE_SHADER. The time variable changes the wobble over time (creating the animation).

Running this, we can see the whole Image is now distorted and looks a bit more wobbly — just like a jellyfish.

Wobble applied all over the whole jellyfish

If we don’t want the effect to apply to the face and bubbles, we can extract those into separate ImageVectors and skip applying the render effect to those vectors:

Wobble Applied without affecting the face

Applying Noise Effect 🙉

The shader we specified above isn’t using a noise function to apply the displacement to the content of the composable. Noise is a way to apply a displacement using a more structured random function. One such type of noise is Perlin noise (which is what feTurbulence uses under the hood). This is what it would look like if we render the result of running the Perlin noise function:

Perlin Noise output

We use the noise value for each coordinate in the space, and use it to query a new coordinate in the “contents” shader.

Let’s update our shader to use a Perlin noise function (adapted from this GitHub repo). We will then use it to determine the coordinate mapping from input coordinate to output coordinate (i.e. a displacement map).
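
To illustrate the structure, here is a simplified sketch that uses a basic hash-based value noise as a stand-in for the full Perlin implementation, and uses it to displace where each pixel is sampled from:

val NOISE_SHADER = """
    uniform float2 resolution;
    uniform float time;
    uniform shader contents;

    // Cheap hash-based 2D value noise, a simplified stand-in for Perlin noise.
    float hash(float2 p) {
        return fract(sin(dot(p, float2(127.1, 311.7))) * 43758.5453);
    }

    float valueNoise(float2 p) {
        float2 i = floor(p);
        float2 f = fract(p);
        float2 u = f * f * (3.0 - 2.0 * f);
        float a = hash(i);
        float b = hash(i + float2(1.0, 0.0));
        float c = hash(i + float2(0.0, 1.0));
        float d = hash(i + float2(1.0, 1.0));
        return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
    }

    half4 main(float2 fragCoord) {
        float2 uv = fragCoord / resolution;
        // Two low-frequency noise samples drive the x and y displacement.
        float nx = valueNoise(uv * 6.0 + time * 0.5);
        float ny = valueNoise(uv * 6.0 + time * 0.5 + 100.0);
        float2 displacement = (float2(nx, ny) - 0.5) * 30.0;
        return contents.eval(fragCoord + displacement);
    }
""".trimIndent()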

Applying this noise function, we get a much better result! The jellyfish looks as if it is moving inside the water.

Perlin Noise applied to the jellyfish body

But why would I use this?

At this point you might be wondering: this is cool — but very niche in its use case, Rebecca. Sure — maybe you aren’t making an animated jellyfish every day at work (we can dream, right?). But RenderEffects can be applied to any composable tree — allowing you to apply effects to just about anything you want.

For example, why wouldn’t you want your gradient text or whole composable screen to have a noise effect or any other AGSL effect your heart desires?

Perlin Noise applied to whole Composable
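
As a sketch of what that could look like (reusing the NOISE_SHADER string sketched above; the wrapper name and parameters are illustrative), the same runtime-shader render effect can wrap any composable content:

// Applies the noise render effect to an arbitrary composable subtree (API 33+).
@Composable
fun NoisyContent(content: @Composable () -> Unit) {
    val shader = remember { RuntimeShader(NOISE_SHADER) }
    val time by produceState(0f) {
        while (true) {
            withInfiniteAnimationFrameMillis { value = it / 1000f }
        }
    }
    Box(
        modifier = Modifier
            .onSizeChanged { size ->
                shader.setFloatUniform("resolution", size.width.toFloat(), size.height.toFloat())
            }
            .graphicsLayer {
                shader.setFloatUniform("time", time)
                renderEffect = android.graphics.RenderEffect
                    .createRuntimeShaderEffect(shader, "contents")
                    .asComposeRenderEffect()
            }
    ) {
        content()
    }
}

// Usage: NoisyContent { /* any screen content, e.g. gradient text */ }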

Wrap up 🌯

So we’ve covered many interesting concepts in this blog post — creating custom ImageVectors from SVGs, animating parts of an ImageVector and applying AGSL shaders as RenderEffects to our UI in Compose.

For the full code of the Jellyfish — check out the full gist here. For more information on AGSL RenderEffects — check out the documentation, or the JetLagged Sample for another example usage of it.

If you have any questions — feel free to reach out on Mastodon androiddev.social/@riggaroo or Twitter.

Thanks to Jolanda Verhoef, Nick Butcher, Florina Muntenescu, Romain Guy, Nader Jawad for the valuable feedback on this post.

By Rebecca Franks
Source Android



Related Topics
  • AGSL
  • Android
  • ImageVectors
  • RenderEffects
  • SVG