Week 7

Both spectrograms are now being rendered in external ‘jit.window’ objects. Inside both of their respective mapping abstractions, and in the global ‘circleclickpoint_&_line_alphablend’ abstraction, I have replaced the dimension inverting, colour inverting and alpha blending with OpenGL equivalents. I have also replaced instances of ‘jit.matrix’ with ‘jit.gl.slab’ wherever I can.
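As a rough illustration of moving that processing onto the GPU (the exact shader files and context name ‘ctx’ here are my assumptions about a typical Jitter setup, not the actual patch), the invert/blend chain might be sketched as:

```
jit.gl.texture ctx @adapt 1               ← upload the matrix to a GPU texture
   |
jit.gl.slab ctx @file td.rota.jxs         ← texture transform (dimension flipping), assumed shader
   |
jit.gl.slab ctx @file cc.invert.jxs       ← colour invert on the GPU
   |
jit.gl.videoplane ctx                     ← draw the result in the jit.window context
```

Each ‘jit.gl.slab’ runs its shader entirely on the graphics card, so the CPU-side ‘jit.matrix’ copies are avoided.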

Objectives for the week:

1. Figure out a user-friendly way to interact with the spectrograms for time, amplitude and frequency information.
2. Find out how to close one spectrogram when using the other (this will speed up the project as a whole!).
3. Begin to integrate more spectral manipulation effects, using as much OpenGL as possible.
4. Implement a feature where all the DSTFT data is read into the matrix in one operation.

I realised that part of the reason the patch was performing so slowly (mentioned in the semester 2, week 6 post on 25/2/19) was that certain parts of the ‘jit.window’ rendering process (the ‘jit.window’ render rate, and the line-segment refresh rate in the alphablend) were updating unnecessarily quickly. I have slowed them all down and the patch is running smoothly again.
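The fix amounts to driving the render loop from a slower clock. A minimal sketch, assuming a ‘qmetro’-driven Jitter render loop (the 33 ms interval and context name ‘ctx’ are illustrative values, not the patch's actual settings):

```
qmetro 33                 ← ~30 fps; previously firing far more often than needed
   |
t b erase                 ← erase the context, then bang the renderer
   |
jit.gl.render ctx
```

Anything visual only needs to update as fast as the display refreshes, so cutting these rates frees CPU/GPU time for the audio and spectral processing.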

I am going to investigate using ‘poly~’ to encapsulate each spectral processing routine, with the resulting processed matrix still displayed in a ‘jit.pwindow’. So far ‘poly~’ is proving very problematic for OpenGL processing: rendering contexts are not being established when the ‘poly~’ is opened. ‘horizontal_sync~’ is also having trouble being sent inside the ‘poly~’, meaning that the red playback line in each spectrogram is not syncing correctly with the audio.
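For context on the rendering problem: jit.gl objects normally attach to a drawing context by its name, so one possible arrangement (an assumed sketch of a standard Jitter pattern, not the actual patch) is to create the context once at the top level and have the objects inside each ‘poly~’ voice refer to it by that shared name:

```
[top-level patch]                     [inside the poly~ voice]
jit.gl.render ctx                     jit.gl.slab ctx @file cc.invert.jxs
qmetro 33 → t b erase → (render)      jit.gl.videoplane ctx
```

Whether the objects inside the ‘poly~’ bind to the named context at load time or only after the context exists is the behaviour still to be pinned down.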
