Gurus out there:
The differential equations for modeling spacecraft motion can be written as a sum of acceleration terms:
d2r/dt2 = a0 + a1 + a2 + ... + an
Normally a0 is the point-mass acceleration due to the central body (a0 = -mu * r / |r|^3, with r the position vector); the "higher order" terms can come from other planets, solar radiation pressure, thrust, and so on.
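For concreteness, the dominant point-mass term could look something like the following NumPy sketch (the (t, r) call signature and the Earth value of mu are just assumptions on my part; most realistic terms will need the state as well as the time):

import numpy as np

def point_mass_acceleration(t, r, mu=398600.4418):
    """Two-body point-mass term: a0 = -mu * r / |r|^3.

    r is the position vector in km, mu the gravitational parameter in
    km^3/s^2 (Earth by default); t is unused here but kept so every
    acceleration term can share the same call signature.
    """
    return -mu * r / np.linalg.norm(r)**3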
I'm implementing a collection of algorithms meant to work on this sort of system. I will start with Python for design and prototyping, then move on to C++ or Fortran 95.
I want to design a class (or metaclass) which will allow me to specify the different acceleration terms for a given instance, something along the lines of:
# please notice this is meant as "pseudo-code"
def some_acceleration(t):
    return (1*t, 2*t, 3*t)

def some_other_acceleration(t):
    return (4*t, 5*t, 6*t)

S = Spacecraft()
S.Acceleration += some_acceleration + some_other_acceleration
In this case, the instance S would default to, say, two acceleration terms, and I would add the two extra terms I want, some_acceleration and some_other_acceleration; each returns a vector (here represented as a triplet). Notice that in my "implementation" I've overloaded the + operator.
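To make the pseudo-code a bit more concrete, here is a minimal sketch of the kind of behavior I have in mind (the name AccelerationSum and the (t, r) signature are just placeholders, not a settled design; note that two plain functions cannot really be added with +, so in this sketch extra terms are composed onto the container one at a time, or wrapped in another AccelerationSum):

import numpy as np

class AccelerationSum:
    """Container of acceleration callables; += appends a term, and
    calling the container returns the element-wise sum of all terms."""

    def __init__(self, *terms):
        self.terms = list(terms)

    def __iadd__(self, other):
        # accept either a single callable or another AccelerationSum
        if callable(other):
            self.terms.append(other)
        else:
            self.terms.extend(other.terms)
        return self

    def __call__(self, t, r):
        # total acceleration: sum of every term's vector at (t, r)
        return sum(np.asarray(term(t, r)) for term in self.terms)

class Spacecraft:
    def __init__(self, *terms):
        # the default force model is whatever terms the caller passes in
        self.Acceleration = AccelerationSum(*terms)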
This way the algorithms will be designed for an abstract "spacecraft" and all the actual force fields will be provided on a case-by-case basis, allowing me to work with simplified models, compare modeling methodologies, etc.
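Used that way, and reusing the sketches above, I imagine building simplified and fuller force models side by side, something like this (srp_acceleration is just a crude placeholder term):

def srp_acceleration(t, r):
    # crude constant stand-in for solar radiation pressure (km/s^2)
    return np.array([1.0e-11, 0.0, 0.0])

simple = Spacecraft(point_mass_acceleration)   # point mass only
full = Spacecraft(point_mass_acceleration)     # point mass + SRP
full.Acceleration += srp_acceleration

r0 = np.array([7000.0, 0.0, 0.0])              # km
print(simple.Acceleration(0.0, r0))
print(full.Acceleration(0.0, r0))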
How would you implement a class or metaclass for handling this?
I apologize for the rather verbose and not entirely clear question; the idea is still a bit fuzzy in my head.
Thanks.