
Science zone

Dr Darren of Webel originally trained as a computational physicist and applied mathematician. He performed research in radio astronomy and astrophysics from 1988 to 1993, worked as a scientific computing expert and particle accelerator physicist from 1993 to 1999, and has worked on numerous science and education projects since establishing the Webel IT Australia Scientific IT Consultancy in 2000. You can find out more about his science career at: Dr Darren Kelly's full-career Curriculum Vitae.

From Wikipedia: Computational Physics (Aug 2016):

Computational physics is the study and implementation of numerical analysis to solve problems in physics for which a quantitative theory already exists. Historically, computational physics was the first application of modern computers in science, and is now a subset of computational science.

It is sometimes regarded as a subdiscipline (or offshoot) of theoretical physics, but others consider it an intermediate branch between theoretical and experimental physics, a third way that supplements theory and experiment.

This zone features various (mostly archival and historical) science projects, many of which demonstrate applications of the model-based software engineering and systems engineering technologies promoted on this site and offered as Webel services.

HERA particle accelerator: electron Beam Loss Monitor lifetime disruption plots
Example of numerical integration and visualisation of a differential equation in the Maple symbolic algebra system
Maple 3d plot animation example
Maple example: symbolic algebra equation and numerical solution
HERA particle accelerator: custom data analysis application
CT scan slice: visualisation example: 1
CT scan slice: visualisation example: 2
MOST radiotelescope: Java3D animation: steering (9.8M)
Figure 2: A diagram of MOST with the numbering system used in this thesis report (1988)
Figure 3: MOST radiotelescope: A diagram of the coordinate system used in the report (1988)
Figure 10: the MOST radiotelescope synthesised beam
Figure 1: MOST radiotelescope "skymap" from observation of a strong point source at field centre
Figure 11: Model: UML2 composite structure diagram of the monochromator assembly
Figure 09: Model: bunker shield assembly for the Platypus reflectometer as "wrapped block" class diagram.
Figure 10: Model: UML2 composite structure diagram for the monochromation beam stage of the neutron diffractometers of the OPAL NBIs.
Figure 12: Model: UML2 composite structure diagram of the monochromator stage assembly with motorised goniometer rotation, tilt, and translation stages, which are driven by encoded devices.
Figure 13: Model: wrapped block class diagram (software engineering view) for the entire monochromation beam ("logical") stage.

A Drancel is the virtual synthesis 3D "atom" of the Drancing accelerometer music system (where "music" means here both real-time sound and light synthesis).

A Drancel is designed to work with ANY triaxial accelerometer (of which the Wii™ Remote is a convenient example, thanks to its leverage of Bluetooth™ wireless technology).

It is the calibrated, conditioned, "homed", virtual Drancel that is considered to synthesise, not the triaxial accelerometer, which is merely a source of (X,Y,Z) acceleration signals that are mapped to synthesis channel triads and (R,G,B) (or other) light components.
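The signal path above (calibrate, condition, then map the (X,Y,Z) acceleration signals to (R,G,B) light components) can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the actual Drancing implementation: the raw ADC range, the smoothing filter, and all constants here are hypothetical.

```python
# Hypothetical sketch of a Drancel signal path: a raw triaxial
# accelerometer sample is calibrated (offset/gain), conditioned
# (smoothed), and mapped to an (R, G, B) light triad in [0, 1].
# All names and constants are illustrative assumptions.

def calibrate(sample, offset=(512, 512, 512), gain=1.0 / 512.0):
    """Map raw ADC counts to approximate per-axis g-units."""
    return tuple((s - o) * gain for s, o in zip(sample, offset))

def condition(current, previous, alpha=0.2):
    """Simple one-pole smoothing of the calibrated signal."""
    return tuple(alpha * c + (1.0 - alpha) * p
                 for c, p in zip(current, previous))

def to_rgb(accel):
    """Clamp each axis's magnitude into [0, 1] as an RGB component."""
    return tuple(min(abs(a), 1.0) for a in accel)

raw = (700, 512, 300)                 # one raw (X, Y, Z) sample
cal = calibrate(raw)                  # roughly (0.367, 0.0, -0.414)
smooth = condition(cal, (0.0, 0.0, 0.0))
rgb = to_rgb(smooth)
print(rgb)
```

The same conditioned triad would feed a sound-synthesis channel triad in parallel with the light components.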

A Drancel RGB (drancing light element) is to Drancing synthesis what a "pixel" (picture element) is to a picture.

Imagine hundreds of "Drancers" (Drancing performers), each with 5 Drancel RGB units, each synthesising sound and light, and you get the picture! Can you hear it? Can you see it? That's the Drancing vision!

Drancels can act as independent synthesis units or they can be combined (multiplexed) to create arbitrarily complex syntheses. In this PureData prototype there are, however, only 2 Drancels, for 2 hand-held Wiimotes as wireless accelerometer sources. (The original Drancing accelerometer suit (since 1997) had 5 triaxial accelerometers in a "body star" pattern.)

Currently the PureData synthesis prototype offers the following elementary synthesis units:

  • AM: modulation of the amplitudes of a triad of "fixed-frequency" oscillators,
    whose frequencies can be set by the user using the sliders.
  • VFO: variable frequency oscillators: the user can set the scale of frequency driving, a frequency offset, the overall gain of the oscillations, and whether or not the frequencies are discretised to lock on to MIDI note frequencies (as opposed to arbitrarily fine continuous frequency variation).
  • Drums: simple sample drums (I currently use the free AudioPervert CR-78 drum samples) which will probably be replaced soon with completely synthetic PureData drums.
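The VFO's optional MIDI lock-on can be illustrated with the standard equal-tempered note-to-frequency relation. This is an assumed sketch of the discretisation step, not code from the actual PureData patch:

```python
# Sketch (assumed, not from the Drancing patch) of the VFO's optional
# MIDI lock-on: a continuously varying frequency is discretised to the
# nearest equal-tempered MIDI note frequency.
import math

def midi_to_freq(note):
    """Equal-tempered frequency of a MIDI note (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def quantise_to_midi(freq):
    """Snap an arbitrary positive frequency to the nearest MIDI note frequency."""
    note = round(69 + 12 * math.log2(freq / 440.0))
    return midi_to_freq(note)

print(quantise_to_midi(450.0))  # → 440.0 (the nearest note is A4)
```

With lock-on disabled, the driven frequency would be used directly; with it enabled, every frequency passes through a quantiser like this one.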



Not pretty, and not as flexible as a generative programming approach. It's just enough to separate the set of chosen samples (visit DrumSamples.pd) from the playing via triggers and/or radio buttons. It's a rapid prototype after all!



The "executable" 0th level Drancing.pd is just a wrapper for the PureData version of the DranceWare GUI for Drancing. It does not show any connections, so it acts more like a skin than the wrapped 1st level DranceWare.pd. Note the use of deep nesting of UI canvases from nested abstractions.



Very basic recording to a named file. It's a very rapid prototype!

The audio in the linked node was in fact recorded from system audio by a screencast capture system, so that the modes and settings were also recorded visually at the same time.



Drum samples are preloaded into tables for triggered playback.
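The preload-then-trigger idea can be sketched outside PureData as well: read each sample file once into an in-memory table, then let a trigger simply hand back that table for playback. This is an illustrative assumption about the structure, and the file names are hypothetical:

```python
# Illustrative sketch of "samples preloaded into tables for triggered
# playback": each drum sample is read once into an array, and a trigger
# returns that array. A real patch would stream it to the DAC instead.
import wave
import array

tables = {}

def preload(name, path):
    """Read a mono 16-bit WAV file once into a named table."""
    with wave.open(path, "rb") as w:
        tables[name] = array.array("h", w.readframes(w.getnframes()))

def trigger(name):
    """Return the preloaded table for playback; no disk access here."""
    return tables[name]
```

The point of the table approach is that triggering is cheap and glitch-free: all disk I/O happens once, up front, before any performance begins.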

They can be heard triggered once on selection from the Drums.pd radio buttons (with the slight side-effect that the selection process can be heard in recordings).

It's just enough for this prototype; it ain't pretty, and it is bound to specific samples (adapted from the free AudioPervert CR-78 drum samples).

One could trigger samples other than drums and rely on samples for the entire music, however that is not the way of Drancing. Geez, why not just trigger your favourite CD to play from beginning to end, and say you're playing gestural music?! It proves nothing. Only purist real-time synthesis provides realistic diagnostics and measures of the true performance power of evolving gestural synthesis instruments! And it sounds much better, too.

Combining drum (or other) samples with real-time synthesis is an effective compromise.



Echo/delay and feedback can be very useful, as they offer the performer a fixed timescale.
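The "fixed timescale" comes from the delay-line length: echoes always recur at the same interval, while the feedback coefficient controls how long they persist. A minimal sketch, under assumed parameters (the actual patch's delay and feedback settings are not shown here):

```python
# Minimal feedback-delay sketch: a circular buffer of fixed length
# gives a fixed echo interval; the feedback coefficient sets decay.
# Parameters are illustrative assumptions.

def feedback_delay(signal, delay_samples, feedback=0.5, mix=0.5):
    """Mix each input sample with a delayed, fed-back copy."""
    buf = [0.0] * delay_samples  # circular delay line
    out, idx = [], 0
    for x in signal:
        delayed = buf[idx]
        out.append(x + mix * delayed)
        buf[idx] = x + feedback * delayed  # write input plus feedback
        idx = (idx + 1) % delay_samples
    return out

# An impulse through a 4-sample delay line produces echoes at
# 4-sample intervals, halving each time (feedback = 0.5):
print(feedback_delay([1.0] + [0.0] * 11, 4))
```

Because the echo interval never changes, a performer can lock gestures to it like a metronome, which is exactly the "fixed timescale" benefit described above.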
