
Hybrid model#771

Open
maedoc wants to merge 26 commits into master from hybrid-model

Conversation

@maedoc maedoc commented Jan 20, 2026

Reopening because of accidental deletion of #743 , copying desc here:

This is another take on hybrid models, where multiple models are used at once. It is more array-oriented than #739 and closer to TVB style with traits etc. It introduces the following elements to the TVB datatypes (naming suggestions welcome):

  • subnetwork: comprises a set of nodes + neural mass model + integration scheme + monitors
  • projection: connects a cvar from a source subnet to a cvar in a target subnet
  • networkset: collects subnetworks and projections into a coherent whole where we can
    • compute the coupling across all projections
    • compute all dfuns
    • apply an integration scheme to all dfuns to take a step in time
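The three datatypes above can be sketched roughly as follows; this is a minimal illustrative mock-up with hypothetical names and fields, not the actual TVB traits introduced by this PR.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Subnetwork:
    """A set of nodes sharing one neural mass model, scheme and monitors."""
    n_nodes: int
    n_svar: int
    dfun: callable            # dfun(state, coupling) -> derivatives
    state: np.ndarray = None  # shape (n_svar, n_nodes)

    def __post_init__(self):
        if self.state is None:
            self.state = np.zeros((self.n_svar, self.n_nodes))

@dataclass
class Projection:
    """Connects a cvar of a source subnet to a cvar of a target subnet."""
    source: Subnetwork
    target: Subnetwork
    source_cvar: int
    target_cvar: int
    weights: np.ndarray       # shape (target.n_nodes, source.n_nodes)

@dataclass
class NetworkSet:
    """Collects subnetworks and projections into a coherent whole."""
    subnets: list
    projections: list

    def couplings(self):
        # compute the coupling across all projections
        c = {id(s): np.zeros_like(s.state) for s in self.subnets}
        for p in self.projections:
            c[id(p.target)][p.target_cvar] += p.weights @ p.source.state[p.source_cvar]
        return c

    def step(self, dt):
        # compute all dfuns, then take a (naive Euler) step in time;
        # the PR applies real integration schemes here
        c = self.couplings()
        for s in self.subnets:
            s.state = s.state + dt * s.dfun(s.state, c[id(s)])

# usage: two 3-node subnets with linear dynamics, one projection a -> b
a = Subnetwork(3, 1, lambda x, c: -x + c)
b = Subnetwork(3, 1, lambda x, c: -x + c)
a.state[:] = 1.0
net = NetworkSet([a, b], [Projection(a, b, 0, 0, np.eye(3))])
net.step(0.1)
```

The key point of the design is that each subnetwork keeps its own state array, so state shapes may differ between subnetworks, and projections are the only place where they meet.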

From there, it turns out that flexible state shapes create surprises in many places where it's assumed that the state variables are homogeneous across nodes. For instance, the integrator noise is set per state variable, but different models need it set to different values.
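One plausible way around the noise issue, sketched below under assumed names, is to carry one per-svar noise array (nsig) per subnetwork, broadcast against that subnetwork's own state shape:

```python
import numpy as np

rng = np.random.default_rng(42)

def stochastic_euler_step(states, nsigs, dfuns, couplings, dt):
    """One Euler-Maruyama step over heterogeneous subnetworks.

    states, nsigs, dfuns, couplings are parallel lists, one entry per
    subnetwork; each nsig has shape (n_svar, 1) and broadcasts over nodes,
    so n_svar may differ between subnetworks.
    """
    out = []
    for x, nsig, f, c in zip(states, nsigs, dfuns, couplings):
        noise = np.sqrt(2 * nsig * dt) * rng.standard_normal(x.shape)
        out.append(x + dt * f(x, c) + noise)
    return out

# two subnetworks with different numbers of state variables
x1 = np.zeros((2, 4)); x2 = np.zeros((3, 5))
nsig1 = np.array([[1e-3], [0.0]])            # noise only on the first svar
nsig2 = np.array([[1e-4], [1e-4], [0.0]])
f = lambda x, c: -x + c
c1 = np.zeros_like(x1); c2 = np.zeros_like(x2)
y1, y2 = stochastic_euler_step([x1, x2], [nsig1, nsig2], [f, f], [c1, c2], 0.1)
```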

Other major nuances are related to monitors, but I think some straightforward compromises are available: for "debug" monitors like raw or temporal average, the monitor will be applied per model to produce a per-subnetwork trace. More "neuroimaging"-like monitors will choose a particular state variable and apply the forward model (gain matrix, BOLD equations) to it.
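The two compromises could look roughly like this; function names and signatures are hypothetical, not the PR's actual monitor classes:

```python
import numpy as np

def raw_monitor(subnet_states):
    """Debug monitor: one trace per subnetwork, full state copied as-is."""
    return [s.copy() for s in subnet_states]

def eeg_monitor(subnet_states, svar_indices, gain):
    """Neuroimaging monitor: pick one state variable per subnetwork,
    concatenate the node activity, then project it through a
    (n_sensors, n_total_nodes) gain matrix."""
    activity = np.concatenate(
        [s[i] for s, i in zip(subnet_states, svar_indices)])
    return gain @ activity

# two heterogeneous subnetworks: 4 nodes x 2 svars, and 6 nodes x 3 svars
states = [np.ones((2, 4)), 2 * np.ones((3, 6))]
gain = np.ones((8, 10)) / 10          # 8 sensors, 4 + 6 = 10 nodes total
traces = raw_monitor(states)          # list of per-subnetwork arrays
eeg = eeg_monitor(states, [0, 1], gain)   # shape (8,)
```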

Stimulation is another open question, but I wonder, departing from the monolithic style, whether a stimulus couldn't just be a "subnetwork" with a projection to the nodes being stimulated? Ideas welcome.
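To make the idea concrete, a stimulus "subnetwork" could be a single virtual node whose state follows a prescribed waveform, wired to the stimulated nodes through an ordinary projection's weights. Entirely hypothetical naming:

```python
import numpy as np

class StimulusSubnetwork:
    """One virtual node; its 'dynamics' just evaluate a waveform at time t."""
    def __init__(self, waveform):
        self.waveform = waveform
        self.state = np.zeros((1, 1))   # one svar, one node

    def update(self, t):
        self.state[0, 0] = self.waveform(t)

# square pulse between 10 and 20 ms
stim = StimulusSubnetwork(lambda t: 1.0 if 10.0 <= t < 20.0 else 0.0)

# projection weights select which target nodes receive the stimulus
weights = np.zeros((5, 1))   # 5 target nodes, 1 stimulus node
weights[[1, 3], 0] = 0.5     # stimulate nodes 1 and 3

stim.update(15.0)
afferent = weights @ stim.state[0]   # per-node stimulus, via normal coupling
```

The attraction is that no special-case stimulation code is needed: the coupling machinery already knows how to deliver a source subnetwork's activity to target nodes.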

Performance is terrible w/ ~5-10k iter/s on the default connectome w/ two models, but this is expected w/ the naive implementation and will improve dramatically after massaging for JAX or a C++ impl.

recent feedback

Frederico Tesler and Roberta Lorenzi report that there are still some bugs and the API is not intuitive.

implementation status

  • all monitors work
  • all integrators work
  • time delays implemented
  • source/target cvars are arrays, not scalars
  • make it faster
  • subnetwork can have internal projection(s)
  • apis are documented
  • stimulation is supported
  • afferent coupling monitors (fix tests)
  • add tests against existing simulation scenarios
  • toy scenarios for tests integrated
  • better demos
  • clean up API for connectivity
  • consider re-ordering global monitor output to follow original connectome instead of concatenated subnetworks
  • improve coupling API
    • named cvars instead of indices
    • including support for more cvars than svars (cvar=[0,0,0] etc)
    • cfuns
  • ensure integrators call reconfigure_boundaries_and_clamping...
  • check initial conditions are done correctly as in regular tvb
  • for a given interprojection w/ multiple target cvars, we should have a scaling value per target
