Saturday, September 23, 2023

Gravity Simulator #1,000,006

I've written a lot of gravity simulators. This is the first time I've built one inside a game engine. First, I created a Node2D and a Sprite2D with a texture so that it's visible. I then attached the following code to the sprite node:


extends Sprite2D


var v:Vector2=Vector2(100,0)        # test particle velocity, pixels/s
var center:Vector2=Vector2(500,200) # fixed gravitating center, pixels
var mu=3e6                          # gravitational parameter GM, pixels^3/s^2


# Called when the node enters the scene tree for the first time.
func _ready():
	var vec=Vector2(500,0)
	global_position=vec


# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta):
	var r:Vector2=(global_position-center)
	var a:Vector2=-mu*r/r.length()**3
	v+=a*delta
	global_position+=v*delta

The _process() function is a very simple Euler integrator. The acceleration is the standard test-particle gravity equation of motion, where the center is considered to be infinitely heavier than the test particle, such that the gravity of the test particle doesn't affect the motion of the center.
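Written out (and note that because the velocity is updated first and then used in the position update, this is technically the semi-implicit flavor of Euler):

\[\vec{a}_k=-\mu\,\frac{\vec{r}_k-\vec{r}_c}{\left|\vec{r}_k-\vec{r}_c\right|^{3}},\qquad \vec{v}_{k+1}=\vec{v}_k+\vec{a}_k\,\Delta t,\qquad \vec{r}_{k+1}=\vec{r}_k+\vec{v}_{k+1}\,\Delta t\]

where r_c is the fixed center and Δt is the frame time delta.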

The next obvious things to do:
  • Use Runge-Kutta instead of Euler (a rough sketch of what that might look like follows this list)
  • Gravitate towards another sprite instead of an invisible point
  • Give both points finite mass 
  • Gravitate towards any number of masses, to make an N-body simulator.
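
Here is a rough sketch of what the Runge-Kutta version might look like -- classical fourth-order RK4 applied to the same state (position and velocity), with an accel() helper that is my own, not anything built into Godot. I haven't run this exact version, so treat it as a sketch rather than a drop-in replacement:

extends Sprite2D


# Same variables as the Euler version above.
var v:Vector2=Vector2(100,0)
var center:Vector2=Vector2(500,200)
var mu=3e6


# Acceleration of the test particle at position r (same equation of motion as before).
func accel(r:Vector2)->Vector2:
	var rel:Vector2=r-center
	return -mu*rel/rel.length()**3


# Classical RK4: four slope estimates for position (k1..k4) and velocity (l1..l4),
# combined with weights 1, 2, 2, 1.
func _process(delta):
	var r:Vector2=global_position
	var k1:Vector2=v
	var l1:Vector2=accel(r)
	var k2:Vector2=v+l1*delta/2
	var l2:Vector2=accel(r+k1*delta/2)
	var k3:Vector2=v+l2*delta/2
	var l3:Vector2=accel(r+k2*delta/2)
	var k4:Vector2=v+l3*delta
	var l4:Vector2=accel(r+k3*delta)
	global_position=r+(k1+2*k2+2*k3+k4)*(delta/6)
	v+=(l1+2*l2+2*l3+l4)*(delta/6)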

It looks like I might be right about how to do physics in Godot, but it still doesn't feel quite right. Godot includes ragdoll physics and joints. To integrate custom physics with that machinery, I would have to be able to supply custom forces and moments and let the engine do the numerical integration, instead of doing the integration directly myself. I expect something more like:

# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta):
    var r:Vector2=(global_position-center)
    var F:Vector2=-mu*global_mass*r/r.length()**3
    add_force(F)
    var M=... #calculate moment on the object
    add_moment(M)

This way, if the engine uses a higher-order integrator or adds forces of its own, my gravity slides in nicely.
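
For what it's worth, skimming ahead in the Godot 4 class reference, RigidBody2D does seem to have something close to this: apply_central_force() and apply_torque(), with the engine doing its own integration. A minimal sketch, assuming I'm reading that API correctly (the gravity field here is still my invisible point mass, and gravity_scale would need to be zeroed so the built-in gravity doesn't fight it):

extends RigidBody2D


var center:Vector2=Vector2(500,200) # fixed gravitating center
var mu=3e6                          # gravitational parameter GM


func _physics_process(_delta):
	var r:Vector2=global_position-center
	var F:Vector2=-mu*mass*r/r.length()**3
	apply_central_force(F)   # the engine integrates F/m for me
	# apply_torque() would be the place for any moments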

I wish that GDScript were just Python. It isn't, for a reason, but I don't know the reason yet. It could be:

  1. GDScript came first, or at least before Python was widely known.
  2. Python is too hard to integrate. In this case, they considered *creating an entirely new language interpreter* easier than incorporating Python.
  3. True Python implies that the entire marketplace of Python libraries can be used, and that may have been too difficult to achieve.
  4. Python doesn't match the authors' internal model for how to do scripting.
  5. Godot was started as a personal project, and a scripting engine was one of the "fun" things that they wanted to do with it.
In any case, I have reached the part of the tutorial where scripting is introduced. The template for a script for a node includes the following interesting code:

extends Node2D


# Called when the node enters the scene tree for the first time.
func _ready():
	pass # Replace with function body.


# Called every frame. 'delta' is the elapsed time since the previous frame.
func _process(delta):
	pass


The interesting one is _process(). I can imagine a mode of using the game engine where I just use it as a graphics engine: every frame, the engine calls this function, which does all of my custom physics and so on, changes the position properties of a bunch of nodes, then returns and lets the graphics part of the engine do its thing.

Seriously, this function could implement a Runge-Kutta numerical integrator from the ground up: it could use the node properties as part of its state, keep global or static variables for the rest, and implement any law of motion I can think of. I'm not smarter than Newton or Euler and I intend to use their laws, but I don't *have* to.
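
To make that concrete, here is a sketch of that mode of operation: a parent Node2D does all of the physics in _process() for its Sprite2D children and only writes positions back for the renderer. The "mass" metadata key and the constant G are inventions of mine for the sketch, not anything Godot defines:

extends Node2D


const G=50.0          # made-up gravitational constant in pixel units
var velocities={}     # node -> Vector2; the part of the state the nodes don't hold


func _ready():
	for body in get_children():
		velocities[body]=Vector2.ZERO


func _process(delta):
	var bodies=get_children()
	# Accumulate the acceleration on each body from every other body (naive O(n^2) N-body).
	var accel={}
	for body in bodies:
		var a:Vector2=Vector2.ZERO
		for other in bodies:
			if other==body:
				continue
			var r:Vector2=other.global_position-body.global_position
			a+=G*other.get_meta("mass")*r/r.length()**3
		accel[body]=a
	# Euler step, then hand the new positions back to the renderer.
	for body in bodies:
		velocities[body]+=accel[body]*delta
		body.global_position+=velocities[body]*delta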

If this was all Godot did for me, it might be just right. I'm still exploring, seeing what else it can do for me (I'm on part 25 of 47 of the Intro to Godot course). I strongly suspect that the engine does have a physics component though, since I have seen "gravity" as a property of some of the nodes.


Friday, September 22, 2023

Fallout from the Unity Disaster

I have always been a low-level kind of programmer. I have a good intuitive sense of How Things Work, but I tend to build my own mental abstractions on top of that, and I often have trouble learning other people's abstractions.

For instance, one of my unfinished projects is making math explainer videos. I knew about Manim from 3blue1brown, but he uses a completely different mental model of how animations work. It's tough to argue with success -- he has hundreds of videos and millions of subscribers, while I have zero and zero. Plus, I could never get the dancing equations to work as well in PictureBox as he does with TransformMatchingTex().

Similarly, I have always wanted to get into 3D simulator development -- not necessarily a game, more like a virtual environment where I can program virtual robots. One of my other long-term unfinished goals is to animate the launch of various historical spacecraft, much like Jim Blinn animated the flybys of Voyager. I want to be able to write a physically realistic rocket guidance program similar to what a Titan would have actually used, and use that to create a physically accurate, properly scaled animation of the launch. The legend goes that the second Voyager launch (Voyager 1, don't ask) only had about two seconds of fuel left in the upper stage when it finished the boost. How much did the first launch (Voyager 2) have? Is two seconds a lot? This can't really be answered without proper scale.

Anyway, one of the things I have been avoiding as someone else's abstraction is game engine design and usage. I always thought myself capable of writing my own, even though I kept bouncing off of OpenGL and *its* abstractions. I have written simulator engines any number of times, basically just fancy numerical integrators. I have never learned anyone's game engine -- I have never explored anyone else's abstractions.

In the last couple of weeks, the Unity game engine has claimed the authority to alter its relationship with its developers.



Let's just say that the development community hasn't taken it well.

One aspect of the fallout from this announcement is a mad scramble to other engines. It might be too late for projects that are years along and deeply intertwined with Unity, but many, many people are looking at alternatives, and one that keeps coming up is Godot (https://godotengine.org/). This one is free and open source, so it can't help but stay free.

Even though it is free, I did spend about $26 on an online course for it (https://www.humblebundle.com/software/everything-you-need-to-know-about-godot-4-encore-software). The biggest thing I am looking for is how well its abstractions match up with my own. This will be interesting, as I'm not even sure what I expect a game engine to do for me. How does the physics part work? Can I write my own physics? Is there a numerical integrator underlying things? Are translational and rotational kinematics already implemented, so that I just have to write my own dynamics? The engine might do too little and require me to implement my own numerical integrator, or it might do too much and implement things in such a way that I can't do a rocket.


Wednesday, May 31, 2023

Velocicoaster

On Friday, May 26, I rode the Velocicoaster at Universal in Florida. I almost didn't, for a couple of reasons, but I am glad I did. I'm not so sure I would go on it again, but I did offer to do so with my niece once she heard that I went on it.

Monday, April 17, 2023

Shipometer progress

We are on Shipometer 23.04 now. Shipometer 23.03 is a failure, perhaps for multiple reasons.

First, the shipometer is a HAT for a Raspberry Pi that can carry a ZED-F9R GPS+IMU, a BME280 pressure/temperature/humidity sensor, and an ICM20948 9DoF sensor. Even though the ZED-F9R already has an IMU built into it, I want the ICM20948 because I can control it at a lower level, and because it has a magnetometer as well.

Finally, it carries an LPC210x as a precision timer. The ZED-F9R generates a pulse-per-second (PPS) signal whose rising edge is aligned to the top of each second with sub-microsecond accuracy. This PPS is routed to and captured by the Pi, but is also routed to the timer capture input of the LPC210x. This microprocessor is an old ARM7TDMI, but it has a 32-bit hardware timer capable of running at 60 MHz, with multiple usable capture inputs. The LPC210x is programmed to count at 60 MHz, reset after 3.6 billion cycles, capture several inputs, and output the exact timer count of each pulse to its serial port, which is wired to the Pi UART1. This way, the time of each PPS and sensor data-ready signal can be recorded without the latency and jitter that interrupts and non-real-time software on the Pi would introduce.
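
To spell out the arithmetic implied by those numbers:

\[\frac{3.6\times10^{9}\ \text{counts}}{60\times10^{6}\ \text{counts/s}}=60\ \text{s},\qquad \frac{1}{60\ \text{MHz}}\approx 16.7\ \text{ns}\]

so the counter rolls over exactly once per minute, and each captured edge is timestamped with a resolution of about 17 ns.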

Wednesday, February 1, 2023

Jewel of the whenever: Matrix Multiplication

Eigenchris (I kinda wish I had thought of that name first) has a series of videos on tensors. I can't give an opinion on the series yet, because I haven't watched it all. However, he does something cool that I have never seen before as a mnemonic for matrix multiplication (start at 4:32):
 

We arrange the multiplication as follows: the final product goes in the bottom right. To the left of that we put the left matrix, and *above* it we put the right matrix. Each cell of the product is then the dot product of the row of the left matrix and the column of the right matrix that it lines up with.

\[\begin{matrix}  & \begin{bmatrix} \cdot & w_0 & \cdot & \cdot \\ \cdot & w_1 & \cdot & \cdot \\ \cdot & w_2 & \cdot & \cdot \end{bmatrix} \\ \begin{bmatrix} v_0 & v_1 & v_2 \\ \cdot & \cdot & \cdot \end{bmatrix} & \begin{bmatrix} \cdot & \vec{v}\cdot\vec{w} & \cdot & \cdot \\ \cdot & \cdot & \cdot & \cdot \end{bmatrix} \end{matrix}\]
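
As a sanity check, here is the same rule written as code -- a throwaway GDScript function of my own, not anything from the video -- where each output cell c[i][j] is exactly the dot product of row i of the left matrix with column j of the right matrix:

func mat_mul(a:Array,b:Array)->Array:
	var c=[]
	for i in range(a.size()):           # rows of the left matrix
		var row=[]
		for j in range(b[0].size()):    # columns of the right matrix
			var dot=0.0
			for k in range(b.size()):   # the shared inner dimension
				dot+=a[i][k]*b[k][j]    # row i of a, dotted with column j of b
			row.append(dot)
		c.append(row)
	return c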

Friday, September 9, 2022

The One Picture that explains Phase Locked Loops

A phase-locked loop (PLL) had always been mysterious to me until now. The following three pictures explain it all, and the third one is where the light goes on. Here are the first two, to build dramatic tension and because they do the best job I have ever seen of explaining the block diagram of a PLL. They're from Shawn Hymel's series on FPGA programming.


First diagram:

The PLL consists of three sections:
  • The phase detector produces a signal based on whether the reference and fed-back signals are in phase. I'm not sure of the details, but it might be something as simple as comparing both signals to zero (returning 1 if positive and 0 if negative) and then XORing those comparisons. If the signals are in phase, they will always be on the same side of zero, and the phase detection output will be constant. If they are out of phase, sometimes they will be on opposite sides and the phase detection output will not be constant.
  • The low-pass filter takes the phase detection signal, treats it as a PWM, and converts it to analog just by running it through a resistor-capacitor (RC) circuit. The output is then some analog signal that is a function of the average of the phase detection signal.
  • The voltage-controlled oscillator (VCO) then takes that signal as an error signal. I'm sure it does some fancy PID magic, which finds just the right output to keep the input error signal at zero. It feeds this to the oscillator, which then runs at the commanded frequency.
  • The output is fed back to the phase detector to produce a proper closed-loop control system.
In this diagram, the output of the VCO is significantly out of phase with the reference, *because* it is not the right frequency. It's impossible for two signals of different frequency to stay in phase.

Second diagram:

In this diagram, the output phase has locked. The error signal from the phase detector and LPF is zero, and the controller in the VCO knows that whatever settings it is using now are correct, and keeps them there. (Note that in this diagram, the clock is an analog sine wave. Just pretend it's a digital square wave.)

Third diagram, and critical part:


This shows a clock divider in the feedback path. Digital clock dividers are relatively easy to implement, requiring little more than a counter. To divide by N, make a counter big enough to count to N. On each input clock, increment the counter, but when the counter is about to reach N, reset it instead. If N is even, then it's pretty easy to set up some logic so that whenever the counter is in the first half of its run, a low signal is output, and vice versa. Odd is a little bit trickier, but still doable.
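
Not FPGA code, but here is the same counter logic as a little software sketch (the function name and the value of N are made up), just to convince myself the description holds together:

var divide_by=6      # N; assumed even, as in the text
var _count=0


# Call once per input clock edge; the return value is the divided clock.
func on_input_clock()->bool:
	_count+=1
	if _count>=divide_by:
		_count=0                        # reset instead of ever reaching N
	return _count>=divide_by/2.0        # low for the first half of the count, high for the second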

Multipliers, on the other hand, are difficult, and in fact are why we need all this fancy PLL stuff to begin with. With a PLL and a *divider* in the feedback path, we can implement a *multiplier*.

If you put a divider on the input reference signal as well, you can get frequency multiplication by any rational factor.
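
Written as an equation (this follows directly from the loop forcing its two phase-detector inputs to match): with a divide-by-N in the feedback path and a divide-by-M on the reference,

\[\frac{f_{\text{out}}}{N}=\frac{f_{\text{ref}}}{M}\quad\Rightarrow\quad f_{\text{out}}=\frac{N}{M}\,f_{\text{ref}}\]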

Tuesday, August 16, 2022

Shipometer

Having given up on #SoME2, it's time to move on to the next project. Next week I will be going on a cruise. On the cruise, and also on the plane, I wish to record GPS signals. The pocketometer has all the right sensors for that, but unfortunately it is not sufficiently reliable. The Raspberry Pi has already proven itself capable of recording GPS from one of the ZED-F9R breakout boards. Now the question is whether it can also record the sensors over I2C, and the time pulse.

I want to see if I can do this with just the parts that I already have on my desk.

I have a belt bag big enough to hold all the sensors, the Pi, and a 20Ah USB battery pack. That would be a lot less suspicious than stuffing stuff in my pocket.

It would also be great if the Pi could act as a wifi hotspot and serve SSH over it. That way I could check on it from my phone while (literally) in flight.

The last thing that would be awesome is timer capture on the GPIO -- of at least the PPS, and maybe others, like the interrupt lines from the sensors. If the Pi can't do that, maybe we could put a program on the Teensy that would do the timer capture and output over UART or as an I2C slave.


#SoME2 post-mortem

 I did not get a video out on time for SoME2. Even for just the descoped "good part" video, I couldn't get it done in time this morning.