How to completely change the default terminal to Blackbox on Ubuntu 23.10

Ubuntu 23.10 uses GNOME Shell as its default desktop, which makes some of the terminal settings unintuitive to change.

This post will show how to override Ubuntu's settings so that your preferred terminal is used everywhere, for a cohesive experience.

In this example we will use blackbox-terminal.

First, let’s install Blackbox.

sudo apt install blackbox-terminal

Or

flatpak install flathub com.raggesilver.BlackBox

The first setting we’ll change is the default terminal shortcut, Ctrl + Alt + T.

Visit the settings, and navigate to Keyboard Shortcuts.

Click “View and Customize Shortcuts”, then scroll to “Custom Shortcuts”.

Name: blackbox | Command: /usr/bin/blackbox-terminal | Shortcut: Ctrl + Alt + T

Replace the other shortcut when prompted.
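Note: if you installed Blackbox through Flatpak rather than apt, the binary will not be at /usr/bin/blackbox-terminal. In that case, use the Flatpak launch command for the shortcut instead:

flatpak run com.raggesilver.BlackBox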


Run the below command to select an alternate default terminal.

sudo update-alternatives --config x-terminal-emulator

Choose the number that lists your terminal, in this case, Blackbox.


GNOME hardcodes gnome-terminal in some code paths. The easiest way to modify this behavior is to symlink the Blackbox binary to where GNOME expects gnome-terminal.

First, back up the real binaries.

sudo mv /bin/gnome-terminal /bin/gnome-terminal.bak
sudo mv /bin/gnome-terminal.real /bin/gnome-terminal.real.bak

Then link your desired terminal instead.

sudo ln -s /bin/blackbox-terminal /bin/gnome-terminal
sudo ln -s /bin/blackbox-terminal /bin/gnome-terminal.real

We will also change our right click menu options to use Blackbox.

Let’s uninstall the package responsible for adding the Terminal entry to our right-click menu in Nautilus. We will replace it with our own functionality.

sudo apt remove nautilus-extension-gnome-terminal

Next, verify the package python3-nautilus is installed. This is an up-to-date version of python-nautilus.

sudo apt install python3-nautilus

The package will allow us to write custom Python extensions for the file explorer.

Navigate to ‘~/.local/share/nautilus-python/extensions‘ (You may have to create it)
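If the directory doesn’t exist yet, create it from a terminal:

mkdir -p ~/.local/share/nautilus-python/extensions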

Create a file named ‘anythingyouwant.py‘ (ex: open-terminal.py), and add the below python.

import os
from urllib.parse import unquote
from gi.repository import Nautilus, GObject
from typing import List
import subprocess

class OpenTerminalExtension(GObject.GObject, Nautilus.MenuProvider):
    def _open_terminal(self, file: Nautilus.FileInfo) -> None:
        filename = unquote(file.get_uri()[7:])
        subprocess.Popen(["blackbox-terminal", "--working-directory=" + filename])

    def menu_activate_cb(
        self,
        menu: Nautilus.MenuItem,
        file: Nautilus.FileInfo,
    ) -> None:
        self._open_terminal(file)

    def menu_background_activate_cb(
        self,
        menu: Nautilus.MenuItem,
        file: Nautilus.FileInfo,
    ) -> None:
        self._open_terminal(file)

    def get_file_items(
        self,
        files: List[Nautilus.FileInfo],
    ) -> List[Nautilus.MenuItem]:
        if len(files) != 1:
            return []

        file = files[0]
        if not file.is_directory() or file.get_uri_scheme() != "file":
            return []

        item = Nautilus.MenuItem(
            name="NautilusPython::openterminal_file_item",
            label="Open in Terminal",
            tip="Open Terminal In %s" % file.get_name(),
        )
        item.connect("activate", self.menu_activate_cb, file)

        return [
            item,
        ]

    def get_background_items(
        self,
        current_folder: Nautilus.FileInfo,
    ) -> List[Nautilus.MenuItem]:
        item = Nautilus.MenuItem(
            name="NautilusPython::openterminal_file_item2",
            label="Open in Terminal",
            tip="Open Terminal In %s" % current_folder.get_name(),
        )
        item.connect("activate", self.menu_background_activate_cb, current_folder)

        return [
            item,
        ]

You can also customize the label argument under both get_background_items and get_file_items. I usually prefer my entry to simply read “Terminal”.

To see any changes, run nautilus -q to quit Nautilus; the extension will be loaded the next time it opens.

This is great, but right clicking on the Desktop and choosing Open in Terminal there will, oddly, still open gnome-terminal. An extension controls this behavior, so we will have to edit that as well.

Open ‘/usr/share/gnome-shell/extensions/ding@rastersoft.com/app/desktopIconsUtil.js‘

Edit the ‘launchTerminal‘ function to instead spawn your preferred terminal.

function launchTerminal(workdir, command) {
    let argv = ['blackbox-terminal', `--working-directory=${workdir}`];
    if (command) {
        argv.push('-e');
        argv.push(command);
    }
    trySpawn(workdir, argv, null);
}

You can also edit the label in both ‘fileItemMenu.js:358‘ and ‘desktopManager.js:1082‘.


Now regardless of how the terminal is launched, blackbox-terminal should be called instead!



Published 2024-05-21 06:00:00

Modern Pooling Principles in Unity C#

When developing software, performance is one of the most important concerns, especially when targeting a platform like web or mobile.

Creating and destroying objects requires a lot of memory and processing power relative to our other game actions, but we can reduce the impact of instantiation in Unity by simply reusing objects.

In Unity, we can do this by Instantiating all of the objects first, then storing references to them.

We will explore this concept in an example open source game I created ‘slashdot’, which also contains shaders from the last two posts.

https://github.com/gen3vra/slashdot

Setup

We will begin by creating the class that will actually manage our pooled objects. When working with pooled GameObjects, as opposed to simply Instantiating and Destroying them, there are a few key concepts to be careful of. Firstly, we want to disable objects for later reuse rather than destroying them. Rarely, you will need to create and destroy components on initialization, but the vast majority of components, or the GameObject itself, can simply be disabled and enabled.

public GameObject enemyPrefab;
public Queue<Enemy> PooledEnemies = new Queue<Enemy>();
public List<Enemy> TrackedActiveEnemies = new List<Enemy>();

Assign an enemy prefab through the Inspector. Next we will create our pools.

Creating the Objects

Call the setup function in the Awake of the class to set up the pool.

void SetupPools()
{
    for (int i = 0; i < 100; i++)
    {
        var enemy = Instantiate(enemyPrefab, Vector3.zero, Quaternion.identity);
        PooledEnemies.Enqueue(enemy.GetComponent<Enemy>());
        enemy.SetActive(false);
    }
}

This will Instantiate all of the objects and keep a reference for us.
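The Awake call that kicks this off is simply:

void Awake()
{
    SetupPools();
}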

Using the Objects

Now, when we want to use a GameObject, we can simply call a function on our pooling class to return one for us to manipulate.

A super simple implementation might look something like the below.

public GameObject GetEnemy()
{
    GameObject enemy = PooledEnemies.Dequeue().gameObject;
    return enemy;
}

That works if we’re only using the Queue type and planning for a single enemy type. However, we want to use multiple enemy types, so we can make our pooled enemies a List for more flexibility. An example implementation of this logic is an EnemyType enum that the GetEnemy function checks, like so.

public List<Enemy> PooledEnemies = new List<Enemy>();
public GameObject GetEnemy(Enemy.EnemyType enemyType)
{
    foreach (var enemy in PooledEnemies)
    {
        if (enemy.CurrentEnemyType == enemyType)
        {
            PooledEnemies.Remove(enemy);
            return enemy.gameObject;
        }
    }
    // No pooled enemy of this type is available.
    return null;
}

Now we can simply use this as we would an instantiated object.

randomEnemyType = Random.Range(0, 3) == 0 ? 1 : 0;
var enemy = GetEnemy((Enemy.EnemyType)randomEnemyType);
enemy.transform.position = new Vector3(Random.Range(0, 100), Random.Range(0, 100), 0f);
enemy.SetActive(true);
var enemyComponent = enemy.GetComponent<Enemy>();
enemyComponent.Init();
TrackedActiveEnemies.Add(enemyComponent);

Returning the Object to the Pool

We can use a function like the one below to return a used object to the pool after we are done with it.

public void RemoveEnemy(Enemy enemy)
{
    enemy.gameObject.SetActive(false);

    TrackedActiveEnemies.Remove(enemy);
    PooledEnemies.Add(enemy);
}

Simply call RemoveEnemy() wherever needed.

Manager.Instance.RemoveEnemy(this);

Re-using Objects

Most of the quirks that you’ll encounter from pooling GameObjects like this stem from figuring out how to reset everything nicely. Unity doesn’t run most code on disabled objects, so it’s usually preferable to reset state in Init to avoid unexpected behavior.
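As a sketch of that idea (the fields below are hypothetical and not taken from slashdot), an Init method might clear out whatever state the enemy accumulated during its previous life:

using UnityEngine;

public class Enemy : MonoBehaviour
{
    public enum EnemyType { Basic, Fast }   // illustrative values
    public EnemyType CurrentEnemyType;

    float health;
    Rigidbody2D rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody2D>();
    }

    // Called by the pool manager every time this enemy is pulled from the pool.
    public void Init()
    {
        health = 100f;                      // restore stats to their defaults
        if (rb != null)
        {
            rb.velocity = Vector2.zero;     // clear any leftover motion
            rb.angularVelocity = 0f;
        }
        StopAllCoroutines();                // cancel anything left over from the last life
    }
}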



Source

Itch.io



Published 2024-02-07 06:00:00

Unity Shaders Intro Part 2: HLSL/CG | Edge Distortion Effects

I recently saw these UI effects in a game called Cult of the Lamb and they were very satisfying to watch. Let’s learn how to create our own types of effects like these.

Prerequisites

  • Unity (I’m using 2022.3.17f)
  • Photo editing software (Aseprite, Photoshop, etc)
  • Seamless perlin noise generator for the noise texture we will need later

Base 2D Shader

Create a basic empty file with the ‘.shader’ extension in your Unity project, or right click > Create > Shader > Standard Surface Shader.

Shader "Custom/EdgeShader" 
{
	Properties 
	{
	}
	
	SubShader
	{		
		Pass 
		{
			CGPROGRAM
			ENDCG
		}
	}
}

We want to begin with a base shader to manipulate, so let’s start by displaying a sprite.

To set our texture, the shader must expose a property to the editor. Add a line under Properties defining a main texture.

_MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}

And the variable under SubShader.

sampler2D _MainTex;
float4 _MainTex_ST;

The _ST value will contain the tiling and offset fields for the material texture properties. This information is passed into our shader in the format we specified.

Now define the vertex and fragment functions.

struct vct 
{
	float4 pos : SV_POSITION;
	float2 uv : TEXCOORD0;
};

vct vert_vct (appdata_base v) 
{
	vct o;
	o.pos = UnityObjectToClipPos(v.vertex);
	o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
	return o;
}

fixed4 frag_mult (vct i) : COLOR 
{
	fixed4 col = tex2D(_MainTex, i.uv);
	col.rgb = col.rgb * col.a;
	return col;
}

Simple enough.

…or is it? That doesn’t look like it’s working properly. Let’s fix it.

We can add a Blend under our tags to fix the transparency issue.

Blend SrcAlpha OneMinusSrcAlpha

And we can add the vertex color to our shader. At this point, we can display 2D sprites on the screen, yay!

Shader "Custom/EdgeShaderB" 
{
    Properties 
    {
        _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
    }
    
    SubShader
    {		
        Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
        Blend SrcAlpha OneMinusSrcAlpha
        
        Pass 
        {
            CGPROGRAM
            #pragma vertex vert_vct
            #pragma fragment frag_mult 
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;
            
            struct vct 
            {
                float4 vertex : POSITION;
                fixed4 color : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            vct vert_vct(vct v)
            {
                vct o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.color = v.color;
                o.texcoord = v.texcoord;
                return o;
            }

            fixed4 frag_mult (vct i) : COLOR
            {
                fixed4 col = tex2D(_MainTex, i.texcoord) * i.color;
                return col;
            }

            ENDCG
        }
    }
}

Now we can start messing with things.

Edge Distortion Shader

We want to add some movement and distortion to our sprite. Begin with movement.

How can we manipulate our shader pixels? Let’s show an example by modifying our main texture’s position: shift the texture coordinate down and to the left.

fixed4 frag_mult (vct i) : COLOR
{
	float2 shift = i.texcoord + float2(0.15, 0.25);
	fixed4 col = tex2D(_MainTex, shift) * i.color;

	return col;
}

Okay, now how about some movement?

fixed4 frag_mult (vct i) : COLOR
{
	float2 shift = i.texcoord + float2(cos(_Time.x * 2.0) * 0.2, sin(_Time.x * 2.0) * 0.2);
	fixed4 col = tex2D(_MainTex, shift) * i.color;

	return col;
}

If you examine your sprite at this point, you may notice some odd distortion as it moves.

Set your sprite’s import settings correctly!
Mesh Type: Full Rect
Wrap Mode: Repeat

Once you ensure your sprite has the correct import settings, it’s time to introduce our final 2d sprite we want to manipulate with the shader to achieve our effect.

This image will greatly change the shader appearance, and you should try different gradients and patterns. Here’s my image scaled up:

But I recommend using the smallest resolution that looks good for your project due to memory and performance.

yes it’s that small (12×12)

We also need a seamless noise texture for the distortion.

Let’s add another variable for it.

_NoiseTex ("Base (RGB) Trans (A)", 2D) = "white" {}

Once we’ve assigned our noise texture, it’s time to start moving it.

fixed4 frag_mult (vct i) : COLOR
{
	float2 shim = i.texcoord + float2(
		tex2D(_NoiseTex, i.vertex.xy/500 - float2(_Time.w/60, 0)).x,
		tex2D(_NoiseTex, i.vertex.xy/500 - float2(0, _Time.w/60)).y
	);
	fixed4 col = tex2D(_MainTex, shim) * i.color;
	return col;
}

Now, add the static sprite to its left in the same color and connect it vertically.

Adjusting the transparency will function as expected, so we could overlay this.

Shader "Custom/EdgeShader" 
{
    Properties 
    {
        _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
        _NoiseTex ("Base (RGB) Trans (A)", 2D) = "white" {}
    }
    
    SubShader
    {		
        Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
        Blend SrcAlpha OneMinusSrcAlpha 
        
        Pass 
        {
            CGPROGRAM
            #pragma vertex vert_vct
            #pragma fragment frag_mult 
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _NoiseTex;
            float4 _MainTex_ST;
            float4 _NoiseTex_ST;
            
            struct vct 
            {
                float4 vertex : POSITION;
                fixed4 color : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            vct vert_vct(vct v)
            {
                vct o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.color = v.color;
                o.texcoord = v.texcoord;
                return o;
            }

            fixed4 frag_mult (vct i) : COLOR
            {
                float2 shim = i.texcoord + float2(
                    tex2D(_NoiseTex, i.vertex.xy/500 - float2(_Time.w/60, 0)).x,
                    tex2D(_NoiseTex, i.vertex.xy/500 - float2(0, _Time.w/60)).y
                );
                fixed4 col = tex2D(_MainTex, shim) * i.color;
                return col;
            }

            ENDCG
        }
    }
}

Crown Shader

Here’s my quick little crown sprite.

Let’s make it evil.

We can repurpose the wall shader we just created, scaling down and smoothing the distortion.

fixed4 frag_mult(vct i) : COLOR
{
    float2 shim = i.texcoord + float2(
        tex2D(_NoiseTex, i.vertex.xy/250 - float2(_Time.w/7.2, 0)).x,
        tex2D(_NoiseTex, i.vertex.xy/250 - float2(0, _Time.w/7.2)).y
    ) / 20;

    fixed4 col = tex2D(_MainTex, shim) * i.color;

    return col;
}

Then we can add another pass to handle the normal sprite display.

Shader "Custom/CrownShader" 
{
    Properties 
    {
        _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
        _NoiseTex ("Base (RGB) Trans (A)", 2D) = "white" {}
        _SpriteColor ("Color Tint Mult", Color) = (1,1,1,1)
    }
    
    SubShader
    {
        Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
        Blend SrcAlpha OneMinusSrcAlpha
        
        Pass 
        {
            CGPROGRAM
            #pragma vertex vert_vct
            #pragma fragment frag_mult 
            #pragma fragmentoption ARB_precision_hint_fastest
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _NoiseTex;
            float4 _MainTex_ST;
            float4 _NoiseTex_ST;

            struct vct
            {
                float4 vertex : POSITION;
                float4 color : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            vct vert_vct(vct v)
            {
                vct o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.color = v.color;
                o.texcoord = v.texcoord;
                return o;
            }

            fixed4 frag_mult(vct i) : COLOR
            {
                float2 shim = i.texcoord + float2(
                    tex2D(_NoiseTex, i.vertex.xy/250 - float2(_Time.w/7.2, 0)).x,
                    tex2D(_NoiseTex, i.vertex.xy/250 - float2(0, _Time.w/7.2)).y
                )/ 20;

                shim *= float2(0.97, 0.91);
                shim -= float2(0.01, 0);

                fixed4 col = tex2D(_MainTex, shim) * i.color;
                return col;
            }
            
            ENDCG
        } 
        Pass 
        {
            CGPROGRAM
            #pragma vertex vert_vct
            #pragma fragment frag_mult 
            #pragma fragmentoption ARB_precision_hint_fastest
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _NoiseTex;
            float4 _MainTex_ST;
            float4 _NoiseTex_ST;

            float4 _SpriteColor;

            struct vct 
            {
                float4 vertex : POSITION;
                float4 color : COLOR;
                float2 texcoord : TEXCOORD0;
            };

            vct vert_vct(vct v)
            {
                vct o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.color = v.color;
                o.texcoord = v.texcoord;
                return o;
            }

            fixed4 frag_mult(vct i) : COLOR
            {
                float2 uv = i.texcoord;
                uv -= 0.5;
                uv *= 1.1;
                uv += 0.5;

                fixed4 col = tex2D(_MainTex, uv);
                col.rgb = _SpriteColor.rgb;

                return col;
            }
            
            ENDCG
        } 
    }
}

Source



Published 2024-01-26 06:00:00

Unity Shaders Intro Part 1: Shader Graph | Creating Player Highlight / Obscuring Area Effect Mask Shader

Shaders can be a useful way to enhance the visual presentation of your project through subtle or not-so-subtle effects. Beyond code, the engine provides a built-in visual scripting tool to create shaders from version 2019 onwards.

We will create an effect that allows us to highlight the player and obscure the rest of our stage. With scripting, we can also modify our exposed shader properties to adjust the intensity of the transparency effect, and transition to having no highlight. Examples will be shown later in the post.

Prerequisites

Ensure you have the Shader Graph package installed in your version of Unity. I am using 2022.3.17f for this post.

Creating the Shader

Right click in your Unity Project and do Create > Shader Graph > Blank Shader Graph

Now that we have a Shader Graph file, simply open the editor by double clicking it.

Let’s add some basic shader properties first. Navigate to the Graph Settings and add Built In as a target. We want the ability to control the transparency of our pixels, so also add the Alpha property to our fragment.

In order to properly utilize the Alpha property, we will need to set the Built In target’s Surface Type to Transparent.

Shader Inputs

The first thing to consider is the Player’s world position. Since we want the highlight effect to follow the player, we’ll need some sort of input into the shader.

In the Shader Graph editor, ensure the ‘Blackboard’ option is checked and visible, then click the plus button on the left side of the editor to create an input variable. Make it a Vector3. The ‘Name’ is for visual purposes, and the ‘Reference’ field is what scripts use to access the property. Give that a value like “_PlayerPosition” and drag it onto the graph.

Since that’s simply a Vector, we need to translate that into something usable for our shader. We need to subtract the input player position from our world position so we can get the individual area to affect.

Right click, and create a Position and Subtract node.

Connect the player position and world position node to the subtract node. At this point your graph should look similar to below.

Next we will need a Length node to translate our position into a distance.

At this point, if we connect the output of our length to our Base Color on our Fragment, we can see a strange divine light.

How can we control the actual effect size?

We need a multiply node and some additional input here to control the highlight amount.

Let’s create a new Multiply node, and a Float input.

Name the Float input something like _EffectStrength, and feed the length output into the new multiply node.

You should have something similar to this, and the shader will go black again. This is simply because we haven’t given it an effect strength yet.

Save this Shader Graph asset and assign it to an object in our scene if you haven’t already.

Notice the warning. This refers to the fact that we aren’t rendering a sprite. This is correct, and can be safely ignored.

Assuming a reference to the sprite renderer component, we can then use the material set property functions to pass along our game values in an Update function or whenever needed.

RevealBG.material.SetVector("_PlayerPosition", position);
RevealBG.material.SetFloat("_EffectStrength", highlightingPlayerAmount);

Set the effect to something visible like 1 for now. We can also set a default through the Shader Graph editor.
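Putting that together, a minimal MonoBehaviour sketch might look like the below. The class name, the RevealBG field, and the player reference are placeholders for whatever exists in your scene.

using UnityEngine;

public class PlayerReveal : MonoBehaviour
{
    public SpriteRenderer RevealBG;    // the renderer using our Shader Graph material
    public Transform player;           // whatever the highlight should follow
    public float highlightingPlayerAmount = 1f;

    void Update()
    {
        // Push the player's world position and the effect strength into the shader each frame.
        RevealBG.material.SetVector("_PlayerPosition", player.position);
        RevealBG.material.SetFloat("_EffectStrength", highlightingPlayerAmount);
    }
}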

All of this grey is pretty boring, so let’s add some color. The ability to edit our colors through scripting is pretty important, so let’s create two new Color variables.

The shader will lerp between these two colors for the highlight effect. We could use only one color considering our goal of mixing the effect with transparency, but the additional color gives more control over the effect appearance.

Create a Lerp node. Connect the output of the previous multiply node to the lerp T input, and the two new colors to the A and B inputs, respectively.

I set BGColor to blue, and PlayerRevealColor to red through the graph inspector to clearly show the shader effect.

If all goes well, you should have a circular gradient in the input colors you’ve specified.

And something like this in your Shader Graph.

That gradient isn’t really the look we want. Instead, we want a tight circular highlight around the player position.

To achieve this, we can add a Step node.

Insert it between the multiply and lerp node at the end, and it will produce a gated circular output.

Adjusting the EffectStrength should make the circle appear larger. Try values from 0 -> 1. Above 1 will make the highlight smaller.

EffectStrength at 0.5
EffectStrength at 0

Now we just need to connect our transparency logic.

Add another Multiply node that we will use for the Alpha property on the Fragment. The input should be our previous multiply node’s output, before the Step node. This allows control over the strength of the highlight fade. I went with 1.5.

You’re pretty much finished!


We can adjust the colors to create screen wave effects like this, which could be enhanced with particle effects.

Or as a game over effect where you hide the rest of the stage and highlight the player. I added a purple background sprite behind the player to show the masking effect.

Force fields, lights for dark mazes etc all follow a similar concept.


Source



Published 2024-01-21 06:00:00

Pure JavaScript Asteroids Clone with Enemy Ships Source Code

There are many acceptable JavaScript game engines available nowadays, but you can often get good performance from writing your own simple engine or renderer, depending on your use case. The code for this project is on my GitHub, linked below.

What goes into writing a game engine?

Ideally, we want to handle a few important things.

  1. States, whether that be states of objects (alive, dead, moving, the type of enemy)
  2. Rendering
  3. Spawnable objects (with previously mentioned states)
  4. Input
  5. Save data

We approach this task with an object-oriented mindset instead of a functional programming mindset. Although there are a few global variables such as the overall running game state or the object pool arrays, most of the memory or information we need to remember occurs on a per-object basis.
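As a rough sketch of that split (these names are illustrative rather than the exact ones in the final source):

// Global state shared across the whole game.
var gameState = 'menu';              // 'menu', 'playing', 'gameover'
var screenWidth = 800;
var screenHeight = 600;

// Object pools / tracking arrays, one per spawnable type.
var asteroids = [];
var bullets = [];
var enemies = [];
var particles = [];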

We will be using a ‘Canvas‘ to draw our simple asteroid graphics. Writing a 3D renderer in JS is a much more complex task, although libraries like three.js exist to get you started.

To begin with, we want to define a Vector2D class that we can reuse throughout our game. I’m familiar with Unity so I imagine an implementation similar to their engine’s GameObject setup, but any class that can read / write an X and Y will work.

var Vec2D = (function() {
    var create = function(x, y) {
        var obj = Object.create(def);
        obj.setXY(x, y);

        return obj;
    };

    var def = {
        _x: 1,
        _y: 0,

        getX: function() {
            return this._x;
        },

        setX: function(value) {
            this._x = value;
        },

        getY: function() {
            return this._y;
        },

        setY: function(value) {
            this._y = value;
        },

        setXY: function(x, y) {
            this._x = x;
            this._y = y;
        },

        getLength: function() {
            return Math.sqrt(this._x * this._x + this._y * this._y);
        },

        setLength: function(length) {
            var angle = this.getAngle();
            this._x = Math.cos(angle) * length;
            this._y = Math.sin(angle) * length;
        },

        getAngle: function() {
            return Math.atan2(this._y, this._x);
        },

        setAngle: function(angle) {
            var length = this.getLength();
            this._x = Math.cos(angle) * length;
            this._y = Math.sin(angle) * length;
        },

        add: function(vector) {
            this._x += vector.getX();
            this._y += vector.getY();
        },

        sub: function(vector) {
            this._x -= vector.getX();
            this._y -= vector.getY();
        },

        mul: function(value) {
            this._x *= value;
            this._y *= value;
        },

        div: function(value) {
            this._x /= value;
            this._y /= value;
        }
    };

    return {
        create: create
    };
}());       

This will allow us to reference positions more easily. It’s vital to implement a few capabilities for our renderer: we need to be able to draw an object to our canvas at a specified position, and to clear that canvas in preparation for the next frame the game renders.

To draw a line, we can write JavaScript such as:

var c = document.getElementById("canvas");
var ctx = c.getContext("2d");
ctx.moveTo(0, 0);
ctx.lineTo(200, 100);
ctx.stroke();

And if we wanted to clear our canvas, we can use clearRect:

ctx.clearRect(0, 0, canvas.width, canvas.height);

We can define a render function to handle our different objects.

window.getAnimationFrame =
    window.requestAnimationFrame ||
    window.webkitRequestAnimationFrame ||
    window.mozRequestAnimationFrame ||
    window.oRequestAnimationFrame ||
    window.msRequestAnimationFrame ||
function(callback) {
    window.setTimeout(callback, 16.6);
};
function render() {
    context.clearRect(0, 0, screenWidth, screenHeight);
    renderShips();
    renderAsteroids();
    renderBullets();
    getAnimationFrame(loop);
}

function renderShips() {
    ship.renderSelf();
    for (var i = 0; i < enemies.length; i++)
        enemies[i].renderSelf();
}
...etc

Then an example render self function:

renderSelf: function() {
    if (this.hasDied)
        return;
    context.save();
    context.translate(this.pos.getX() >> 0, this.pos.getY() >> 0);
    context.rotate(this.angle);
    context.strokeStyle = playerColor;
    context.lineWidth = (Math.random() > 0.9) ? 4 : 2;
    context.beginPath();
    context.moveTo(10, 0);
    context.lineTo(-10, -10);
    context.lineTo(-10, 10);
    context.lineTo(10, 0);
    context.stroke();
    context.closePath();

    context.restore();
}

Which would render our object assuming a class holding some variables with our Vector2 class we described earlier.

var Ship = (function() {
    var create = function(x, y, ref) {
        var obj = Object.create(def);
        obj.ref = ref;
        obj.angle = 0;
        obj.pos = Vec2D.create(x, y);
        obj.vel = Vec2D.create(0, 0);
        obj.thrust = Vec2D.create(0, 0);
        obj.invincible = false;
        obj.hasDied = false;
        obj.radius = 8;
        obj.idleDelay = 0;
        obj.isSpectating = false;

        return obj;
    };
...etc

We are handling rendering and state management from inside an object now. All that just for a triangle.

player ship

We aren’t done yet. Next we need to handle Input. The goal with creating object classes is reusability and extensibility. We don’t need to spawn multiple instances of an input, so we can handle that globally. Your Input function may look something like this:

window.onkeydown = function(e) {
    switch (e.keyCode) {
        //key A or LEFT
        case 65:
        case 37:
            keyLeft = true;
            break;
            //key W or UP
        case 87:
        case 38:
            keyUp = true;
            break;
            //key D or RIGHT
        case 68:
        case 39:
            keyRight = true;
            break;
            //key S or DOWN
        case 83:
        case 40:
            keyDown = true;
            break;
            //key Space
        case 32:
        case 75:
            keySpace = true;
            break;
            //key Shift
        case 16:
            keyShift = true;
            break;
    }

    e.preventDefault();
};

window.onkeyup = function(e) {
    switch (e.keyCode) {
        //key A or LEFT
        case 65:
        case 37:
            keyLeft = false;
            break;
            //key W or UP
        case 87:
        case 38:
            keyUp = false;
            break;
            //key D or RIGHT
        case 68:
        case 39:
            keyRight = false;
            break;
            //key S or DOWN
        case 83:
        case 40:
            keyDown = false;
            break;
            //key Space
        case 75:
        case 32:
            keySpace = false;
            break;
            //key Shift
        case 16:
            keyShift = false;
            break;
    }

    e.preventDefault();
};

e.preventDefault() will stop users from accidentally hitting keys such as ctrl + L and losing focus from the window, or jumping the page with Space, for instance.

function updateShip() {
    ship.update();

    if (ship.hasDied) return;

    if (keySpace) ship.shoot();
    if (keyLeft && keyShift) ship.angle -= 0.1;
    else if (keyLeft) ship.angle -= 0.05;
    if (keyRight && keyShift) ship.angle += 0.1;
    else if (keyRight) ship.angle += 0.05;

    if (keyUp) {
        ship.thrust.setLength(0.1);
        ship.thrust.setAngle(ship.angle);
    } else {
        ship.vel.mul(0.94);
        ship.thrust.setLength(0);
    }

    if (ship.pos.getX() > screenWidth) ship.pos.setX(0);
    else if (ship.pos.getX() < 0) ship.pos.setX(screenWidth);

    if (ship.pos.getY() > screenHeight) ship.pos.setY(0);
    else if (ship.pos.getY() < 0) ship.pos.setY(screenHeight);
}

...etc

function checkDistanceCollision(obj1, obj2) {
    var vx = obj1.pos.getX() - obj2.pos.getX();
    var vy = obj1.pos.getY() - obj2.pos.getY();
    var vec = Vec2D.create(vx, vy);

    if (vec.getLength() < obj1.radius + obj2.radius) {
        return true;
    }

    return false;
}

...etc

Once we have the ability to render a reusable object to a canvas and read / write a position that can be checked, we use that as a template to create other objects (particles, asteroids, other ships).
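For example, an asteroid can follow the same factory pattern as the Ship above. The exact fields here are a guess for illustration, not copied from the final source.

var Asteroid = (function() {
    var create = function(x, y, radius) {
        var obj = Object.create(def);
        obj.pos = Vec2D.create(x, y);
        obj.vel = Vec2D.create(Math.random() - 0.5, Math.random() - 0.5);
        obj.angle = 0;
        obj.radius = radius;
        obj.hasDied = false;
        return obj;
    };

    var def = {
        update: function() {
            this.pos.add(this.vel);
            this.angle += 0.01;
        },

        renderSelf: function() {
            if (this.hasDied) return;
            context.save();
            context.translate(this.pos.getX() >> 0, this.pos.getY() >> 0);
            context.rotate(this.angle);
            context.beginPath();
            // Draw a rough hexagon around the collision radius.
            for (var i = 0; i < 6; i++) {
                var a = i / 6 * Math.PI * 2;
                var px = Math.cos(a) * this.radius;
                var py = Math.sin(a) * this.radius;
                if (i === 0) context.moveTo(px, py);
                else context.lineTo(px, py);
            }
            context.closePath();
            context.stroke();
            context.restore();
        }
    };

    return { create: create };
}());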

hexagon asteroid
enemy ship example

You can make interesting graphics with just basic shapes. We handle collision by assigning either an xWidth and yWidth plus an xOffset and yOffset, or a radius. This is again stored on the object itself to keep track of.
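For the box variant, a simple overlap check could look like this, assuming the object stores xWidth, yWidth, xOffset, and yOffset as described above:

function checkBoxCollision(obj1, obj2) {
    // Axis-aligned bounding box overlap using each object's stored size and offset.
    var left1 = obj1.pos.getX() + obj1.xOffset, right1 = left1 + obj1.xWidth;
    var top1 = obj1.pos.getY() + obj1.yOffset, bottom1 = top1 + obj1.yWidth;
    var left2 = obj2.pos.getX() + obj2.xOffset, right2 = left2 + obj2.xWidth;
    var top2 = obj2.pos.getY() + obj2.yOffset, bottom2 = top2 + obj2.yWidth;

    return left1 < right2 && right1 > left2 && top1 < bottom2 && bottom1 > top2;
}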

asteroids game example

Further Techniques

If we can control the rendering manually, we can leave an ‘afterimage’ on our canvas before rendering the next frame, as opposed to clearing it entirely. To do this, we can fade the previous frame with a semi-transparent fill instead of a full clear.

// Get the canvas element and its 2D rendering context
const canvas = document.getElementById('myCanvas');
const ctx = canvas.getContext('2d');
// Set the initial alpha value
let alpha = 0.1; // You can adjust this value to control the fading speed
// Function to create the afterimage effect
function createAfterimage() {
    // Set a semi-transparent color for the shapes
    ctx.fillStyle = `rgba(255, 255, 255, ${alpha})`;
    // Fill a rectangle covering the entire canvas
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    // Decrease alpha for the next frame
    alpha *= 0.9; // You can adjust this multiplier for a different fade rate
    // Request animation frame to update
    requestAnimationFrame(createAfterimage);
}
// Call the function to start creating the afterimage effect
createAfterimage();

And a simple localStorage can be used to save scores.

function checkLocalScores() {
    if (localStorage.getItem("rocks") != null) {
        visualRocks = localStorage.getItem("rocks");
    }
    if (localStorage.getItem("deaths") != null) {
        visualDeaths = localStorage.getItem("deaths");
    }
    if (localStorage.getItem("enemyShips") != null) {
        visualEnemyShips = localStorage.getItem("enemyShips");
    }
    updateVisualStats();
}
function saveLocalScores() {
    localStorage.setItem("rocks", visualRocks);
    localStorage.setItem("deaths", visualDeaths);
    localStorage.setItem("enemyShips", visualEnemyShips);
}

End Result

You can see and play the game here.

Source code is here. ✨



Published 2023-11-30 23:51:07

AI Music Generation: MusicGen

Researchers have recently released a new paper and accompanying model, “Simple and Controllable Music Generation”, where they highlight that it “is comprised of a single-stage transformer LM together with efficient token interleaving patterns, which eliminates the need for cascading several models”. What this essentially means in practice is that music generation can now be completed in fewer steps, and it is getting more efficient as progress is made on various types of models.

I expect AI to hit every industry at an increasingly rapid pace as more research becomes available and progress starts leapfrogging off other models. MUSICGEN was trained with about 20K hours of licensed music, and the results are impressive.

Here are some interesting generations I thought sounded nice. As more models from massively trained datasets hit the public, we will see more community efforts and models as well just like with art.

Medium Model

I used the less performant medium model (1.5B parameters and approx 3.7 GB) to demonstrate how even on relatively poor hardware you could achieve reasonable results. Here is some lofi generated from the medium model.
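If you want to try it yourself, generating audio with the audiocraft library looks roughly like the below; treat this as a sketch and check the project’s README for the current API.

from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

# Load the ~1.5B parameter medium model and ask for 30 seconds of audio.
model = MusicGen.get_pretrained('facebook/musicgen-medium')
model.set_generation_params(duration=30)

wavs = model.generate(['calm lofi hip hop beat with vinyl crackle'])
for i, wav in enumerate(wavs):
    # Writes lofi_0.wav next to the script, loudness-normalized.
    audio_write(f'lofi_{i}', wav.cpu(), model.sample_rate, strategy='loudness')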

Large Model

A step up is the 6.5 GB large model. This produces slightly better sounding results.

What is that melody?

There is also a ‘Melody’ model that is a refined 1.5B parameter version.

Limitations

There are a few limitations on this model, namely the lack of vocals.

Limitations:

  • The model is not able to generate realistic vocals.
  • The model has been trained with English descriptions and will not perform as well in other languages.
  • The model does not perform equally well for all music styles and cultures.
  • The model sometimes generates end of songs, collapsing to silence.

However, future models and efforts will remedy these points. It’s only a matter of time before a trained vocal model is released with how fast machine learning advancements are accelerating.



Published 2023-06-10 18:36:40

Starbound 1.4.4 Source Code

Starbound has been one of my favorite games of all time, so I’m happy to say that I have the latest Starbound source code, last commit August 7th, 2019. I will not be explaining how I got these files. It is the actual source, not just a decompilation, and as such includes build scripts, unused stuff, old migration code, comments, a stored test player, etc.

Source Screenshots

The source has minimal comments, and the structure is reasonable. I found the code easy to read and understand, but perhaps that’s because I’ve been modding Starbound for years now and am familiar with its behavior.

Languages Breakdown (GitHub)

StarEnvironmentPainter.cpp

StarEnviroment.cpp preview

StarMixer.cpp (audio related)

StarMixer.cpp source preview

StarTools.cpp

StarTools.cpp source preview

Building

And of course, we can build it. I compiled this version without Steam API or the Discord rich presence API, but those are easily included.

Skip to 1:10 to see the game launch

Funny Developer Comments

Here’s a look at some of the best (in my opinion) developer comments in the source. This is not intended to be a mockery, far from it; I’m ecstatic I can take a peek into the minds of the developers. Enjoy.

// message is fullbody encrypted so the response is trust worthyish

// Meh, padding is hard-coded here

// TODO: I hate these hardcoded values.  Please smite with fire.

// TODO: Get rid of this stupid fucking bullshit, this is the ugliest
// fragilest pointlessest horseshit code in the codebase.  It wouldn't
// bother me so bad if it weren't so fucking easy to do right.

// This was once simple and elegant and made sense but then I made it
// match the actual platform rendering more closely and now it's a big
// shitty pile of special cases again. RIP.

Example: Simple Re-implementation of Vapor Trail and Sitting Toolbar Usage

At some point during development, Chucklefish had the idea to add a vapor trail when the player was falling fast. I could’ve sworn I saw a post on their news about it back when the game was in beta, but I can’t find it now. Anyway, we can add a small snippet to restore it, as an example of further engine work Starbound can benefit from.

// Vapor trail
if (m_movementController->velocity()[1] < -50) {
  m_vaporTrailTimer += WorldTimestep;
  if (m_vaporTrailTimer > 1)
    m_humanoid->setVaporTrail(true);
} else {
  m_vaporTrailTimer = 0;
  m_humanoid->setVaporTrail(false);
}

By adding this snippet, we can see what it was roughly meant to look like.


We can also modify Player restrictions such as

bool Player::canUseTool() const {
  return !isDead() && !isTeleporting() && !m_techController->toolUsageSuppressed() && m_state != State::Lounge;
}

to just

return !isDead() && !isTeleporting() && !m_techController->toolUsageSuppressed();

This allows us to use our inventory while sitting down.

Further Thoughts

Future work on the engine can lead to further modding capabilities and engine optimizations. There are many potential client side performance improvements that could be made without touching any network code. This would maintain compatibility with the vanilla client. The netcode could be updated as well, but this would break compatibility once major changes were made. If both (or more) parties are willing to use a modified client, any theoretical modification could be made. The possibilities are endless.

As of 2024, there now exists a few Starbound open source community projects with the aim of enhancing the base game’s experience. : )



Published 2023-05-27 04:55:45