Friday, October 25, 2013
IT WORKED!!!
I turned the Rocketometer over to the rocket guys back in July, and had no contact with it until after the flight today. It had been exposed to every environment it was going to see except for vacuum, but I still had no confidence that it would work.
Well, it did.
The first time I stuck the SD card into my computer after the flight, it said "what card?"
That was disappointing.
It took a few minutes for the computer to recognize the card, but when it did, I saw that it had recorded 419 MiB in 66 files. One of the last was also the longest, so I ran it through my quick extraction program, then through IDL, and saw the characteristic acceleration pulse during first and second stage.
The first thing I did after that was press the Victory button on my phone.
No one else in the lab either heard that or got it, so I had to shout, "It Worked!!!".
Now I have to analyze the data...
Monday, October 21, 2013
Minimum mission success achieved!
At about 12:01:12 MDT today, the Rocketometer achieved minimum mission success by riding above 100 km and therefore reaching space.
It will be some time yet before I can recover the device to see that it worked, which will represent full mission success.
Saturday, October 12, 2013
The Curiously Recurring Template Pattern
I was doing fine with the Rocketometer analysis code in C++, using the NewMat library to handle matrices, with a Quaternion layer on top that I wrote myself. After five days of doing battle with various things, I finally got something that worked, but I was curious if this was the "best" way to do it. The C++ Standard Template Library didn't have anything directly related to matrices. The Boost library had a section called uBLAS, but the documentation for it kind of de-recommended itself. It suggested several alternatives, and the one that looked best is called Eigen.
Eigen is interesting in that it is entirely header files, with almost all of its code in C++ templates. Templates are cool, mostly because when they are instantiated, the compiler gets to see the code in the context in which it is used, and gets to inline and optimize it there. Specifically, Eigen provides a dynamic-sized matrix, but also templates for fixed-size vectors and matrices. I want to use the fixed sizes as much as possible, because all vector sizes used in Rocketometer data analysis are known at compile time, so the compiler can unroll loops and otherwise optimize the code.
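To make the distinction concrete, here is a toy example (mine, not from the analysis code) putting a fixed-size vector and matrix next to a dynamic-size one:

#include <Eigen/Dense>

int main() {
  // Fixed-size: the dimensions are template parameters, known at compile time,
  // so Eigen can unroll loops and catch size mismatches while compiling.
  Eigen::Matrix<double,9,1> z = Eigen::Matrix<double,9,1>::Zero();
  Eigen::Matrix<double,3,3> A = Eigen::Matrix<double,3,3>::Identity();

  // Dynamic-size: the dimensions are run-time values, checked at run time.
  Eigen::MatrixXd B(9,9);
  B.setIdentity();

  // head<3>() is also a compile-time-sized block, so the whole product below
  // can be inlined and unrolled.
  Eigen::Matrix<double,3,1> v = A*z.head<3>();
  return v.size()==3 ? 0 : 1;
}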
However, templates do not mix with virtual methods, so I had to figure out how to make that work, since I used virtual methods to implement the physics model. I had code that looks like this with NewMat:
class SimModel {
protected:
  /** Physics function. Calculates the derivative of the state with respect to time. */
  virtual ColumnVector fd_only(double t, const ColumnVector& x)=0;
  /** Measurement function. Calculates the measurement from the given state and time. */
  virtual ColumnVector g_only(double t, const ColumnVector& x, int k)=0;
  /** Physics function with process noise. Uses the fd_only virtual function to calculate the physics, then adds process noise. */
  ColumnVector fd(double t, const ColumnVector& x, const ColumnVector* v);
public:
  /** Measurement function with measurement noise. Uses the g_only virtual function to calculate the measurement, then adds measurement noise. */
  ColumnVector g(double t, const ColumnVector& x, int k, const ColumnVector* w) {
    ColumnVector result=g_only(t,x,k);
    if(w)result+=*w;
    return result;
  };
};
But I wanted to adapt that to use Eigen, specifically with the fixed-length vectors, since the size of the state vector is determined by the problem and known at compile time. That means that ColumnVector has to go, replaced by Matrix<double,n,1>, where n is a template parameter giving the size of the state vector. But what about the measurement? The purpose of the k parameter to g_only is to select which of several kinds of measurements to use. For instance, in the Rocketometer problem we have a measurement vector coming from the inertial and magnetic sensors, treated as a single 9-element vector, and we also have measurements coming from radar or equivalent, treated as a 3-element vector. So we need a template function g_only which generates either a 9-element vector or a 3-element vector. You can't do that and have it be virtual, too: virtual functions are resolved at run time, while template instantiations are chosen at compile time. So I can't have a virtual g_only function callable by the base class g function.
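Spelled out in code, the dead end looks something like this (a toy sketch, not the real model class), with the line the compiler rejects left as a comment:

#include <Eigen/Dense>
using Eigen::Matrix;

template<int n> class SimModelNaive {
protected:
  // This compiles, but the return size is fixed, so one virtual function cannot
  // serve both the 9-element sensor measurement and the 3-element radar one.
  virtual Matrix<double,9,1> g_only_sensor(double t, const Matrix<double,n,1>& x, int k)=0;

  // This is what I actually want, but the compiler rejects it (g++ says something
  // like "templates may not be 'virtual'"): each instantiation of a member template
  // is a separate function chosen at compile time, while virtual dispatch needs one
  // fixed slot filled in at run time.
  // template<int m> virtual Matrix<double,m,1> g_only(double t, const Matrix<double,n,1>& x, int k)=0;
};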
Enter the Curiously Recurring Template Pattern (CRTP). As it happens, I read about this just a few days ago while reading up on the C++ language in general. For us, the pattern goes something like this:
template<int n, class Derived> class SimModel {
public:
  template<int m> Matrix<double,m,1> g(double t, const Matrix<double,n,1>& x, int k, const Matrix<double,m,1>* w) {
    Matrix<double,m,1> result=static_cast<Derived*>(this)->template g_only<m>(t,x,k);
    if(w)result+=*w;
    return result;
  };
};
Note that g_only isn't even defined in this class template, only used. In fact, one of the weaknesses of CRTP is that it implies definitions without expressing them, so it is hard to document. Also note the extra template keyword after the arrow operator; see below for details.
The derived class then looks like this:
template<int n> class RocketometerModel: public SimModel<n,RocketometerModel<n>> {
public:
  template<int m> Matrix<double,m,1> g_only(double t, const Matrix<double,n,1>& x, int k);
};
So what happens is that the compiler:
1. Parses the template for SimModel, but doesn't compile it, because it's a template, not actual code yet. Therefore it doesn't matter that g_only is undefined yet.
2. Parses the template for RocketometerModel, and again doesn't compile it.
3. Parses the main code, compiling as it goes along until it hits RocketometerModel. It instantiates and compiles RocketometerModel, in the process instantiating and compiling SimModel.
4. When SimModel is being instantiated and compiled, it contains a call to RocketometerModel's g_only, but that's fine, since that declaration is already available from step 2.
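To see the whole chain in action, here is a toy derived model and a call through the CRTP base. It assumes the SimModel template above; ToyModel and its fake measurement are purely for illustration, not the real RocketometerModel:

#include <Eigen/Dense>
using Eigen::Matrix;

// (The SimModel<n,Derived> template defined above is assumed to be in scope.)

// A fake model whose "measurement" is just the first m state elements;
// k would normally select between the sensor and radar measurement models.
template<int n> class ToyModel: public SimModel<n,ToyModel<n>> {
public:
  template<int m> Matrix<double,m,1> g_only(double t, const Matrix<double,n,1>& x, int k) {
    return x.template head<m>();
  }
};

int main() {
  ToyModel<9> model;                                      // state size fixed at compile time
  Matrix<double,9,1> x = Matrix<double,9,1>::LinSpaced(9,1.0,9.0);
  Matrix<double,3,1> w = Matrix<double,3,1>::Constant(0.01);
  // m=3 is deduced from the noise argument; g dispatches to ToyModel::g_only<3>
  // through the static_cast in the base class, with no virtual call anywhere.
  Matrix<double,3,1> z = model.g(0.0, x, 1, &w);
  return z.size()==3 ? 0 : 1;
}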
Now this part I will write in bold, so that Google can see it. The other curiously repeating template pattern is having to use the word .template when using (not defining) a template member function. This solves the error "invalid operands of types ‘<unresolved overloaded function type>’ and ‘int’ to binary ‘operator<’".
It appears that the C++ grammar can't always distinguish the opening angle bracket of a template argument list from an ordinary less-than: when the object's type depends on a template parameter, the compiler doesn't yet know whether the member being named is a template. To disambiguate, you throw in the word template after the dot (it also works after a pointer arrow -> if you are using one of those). I don't understand it completely myself, but Eigen uses this internally, which is how I found out how it works in the first place.
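Here is the smallest example of the problem I could come up with (firstThree is a made-up helper, not Rocketometer code):

#include <Eigen/Dense>

// x has a type that depends on the template parameter n, so while parsing this the
// compiler does not yet know that head is a member template. Without .template it
// parses "x.head < 3" as a less-than comparison, which is where the "unresolved
// overloaded function type ... binary operator<" error comes from.
template<int n> Eigen::Matrix<double,3,1> firstThree(const Eigen::Matrix<double,n,1>& x) {
  return x.template head<3>();   // drop ".template" here to reproduce the error
}

int main() {
  Eigen::Matrix<double,5,1> x;
  x << 1,2,3,4,5;
  return firstThree(x)(0)==1.0 ? 0 : 1;
}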
I am gradually coming to the conclusion that Java did it right with generics, which are compiled once as real code rather than instantiated for every type the way C++ templates are. Java generics are enabled by the "everything is an object" model in which all objects descend from a common class. I am also beginning to think that Java did it right by leaving out operator overloading. Operator overloading, even when fully appropriate, like defining * to mean matrix or quaternion multiplication, is fun to use but a nightmare to implement. And if it is implemented wrong, it might be left to the user to find out when he tries to do something the implementer did not foresee.
All in all, I give Eigen 4/5, especially recommended for new projects, but not for converting old projects. The biggest advantage is speed: what took IDL over an hour took Java and C++ with NewMat about 4 minutes, but takes Eigen only 20 seconds. Also, templated matrix and vector sizes are nice, because they resolve matrix size mismatches at compile time. Finally, zero-based component indexing is what I expect; it is also the reason I don't suggest converting old projects from NewMat, whose indexing is one-based. Also be aware that the Eigen quaternion library uses the convention \(\vec{v}_{to}=\mathbf{p}\vec{v}_{from}\mathbf{p}'\), which is fine and internally consistent, but not consistent with the math I had seen for quaternion derivatives. As a consequence, my code is liberally festooned with q.conjugate() and in some places q.conjugate().conjugate(). It's almost a case of two wrongs making a right, but not quite.
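For the record, here is what that convention looks like in practice (a toy example, not flight code): Eigen's quaternion-times-vector operator applies \(\vec{v}_{to}=\mathbf{p}\vec{v}_{from}\mathbf{p}'\), so math written with the opposite convention picks up a conjugate.

#include <Eigen/Dense>
#include <cmath>

int main() {
  // A 90-degree rotation about +Z as an Eigen quaternion.
  Eigen::Quaterniond p(Eigen::AngleAxisd(M_PI/2, Eigen::Vector3d::UnitZ()));

  Eigen::Vector3d v_from(1,0,0);

  // Eigen's convention: p * v_from computes p v_from p', giving (0,1,0) here.
  Eigen::Vector3d v_to = p*v_from;

  // The opposite convention, p' v_from p, is what you get by conjugating first;
  // hence all the q.conjugate() in my code. Here it gives (0,-1,0).
  Eigen::Vector3d v_conj = p.conjugate()*v_from;

  return ((v_to  - Eigen::Vector3d(0, 1,0)).norm() < 1e-12 &&
          (v_conj- Eigen::Vector3d(0,-1,0)).norm() < 1e-12) ? 0 : 1;
}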
Thursday, October 10, 2013
Calibration
It's kinda weird, but it turns out that C++ is the best language for processing Rocketometer data. There is a cool library NewMat which creates appropriate operator overloads to do matrices in C++, and I have extended it to include quaternions. C++ was doing in minutes what it was taking IDL hours. However, it took me 5 days to translate from IDL to C++, so I had better process a lot of data to ever get that time back.
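A quaternion layer of that kind boils down to a type with an overloaded * implementing the quaternion product. Here is a bare-bones sketch of the idea (a plain standalone struct, not the actual NewMat-based class):

#include <cmath>

// Minimal quaternion type, just to show the operator-overloading idea;
// the real layer is built on NewMat's vector and matrix types.
struct Quat {
  double w, x, y, z;   // scalar part w, vector part (x,y,z)
};

// Hamilton product: scalar part w1*w2 - v1.v2, vector part w1*v2 + w2*v1 + v1 x v2.
Quat operator*(const Quat& a, const Quat& b) {
  return Quat{
    a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
    a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
    a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
    a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
  };
}

Quat conj(const Quat& q) { return Quat{q.w, -q.x, -q.y, -q.z}; }

int main() {
  // A unit quaternion (90 degrees about +Z) times its conjugate is the identity.
  double h = M_PI/4;   // half angle
  Quat p{std::cos(h), 0.0, 0.0, std::sin(h)};
  Quat r = p*conj(p);
  return std::fabs(r.w-1.0) < 1e-12 ? 0 : 1;
}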
The time that the Rocketometer spends in zero gravity may be the most valuable calibration time I ever get. Zero acceleration, zero rotation, and I expect a wide range of temperatures.
Remember that the goal of this is to get something into space, but that goal requires no further effort on my part to achieve. The secondary goal is to be able to calibrate the data and report something useful to Tom. The long-term goal though is to measure the track of Space Mountain.
So I am putting something in Space to get it ready for a roller coaster.
That made me laugh.