The basic idea here is to leverage our toolchain as flexibly as possible. Both Go and C can be parsed into abstract syntax trees, and those trees can be lowered to LLVM IR, the standard intermediate representation that modern compilers use to emit code. In fact, any syntax that we can parse into a golang ast.(Expr|Stmt|...) node can be sent to the code generator and consumed as input by the rest of the toolchain.
I will attempt a proof-of-concept (POC) coding sprint to see how to parse Go from strings into ASTs, and then generate code from them. I will then be able to demonstrate parsing a DSL (Domain Specific Language) composed of fragments of Go joined by extended syntax. Subsets covering all of Go's expression, literal, and compound-statement syntax can be mixed with DSL declarations that mimic golang declarations but carry semantics designed for your context. In the context of holochains, I want a streamlined, more language-independent syntax for defining a nucleus.
The broader context is a new architecture for ceptr. This approach will give us a great deal of flexibility to create a toolchain that compiles ceptr/holochain applications directly from multilingual components. Pcubed will be able to use these flexible toolchains to generate basic procedural code, as in C and Go, alongside Semtrex code. Semtrex code can borrow extensively from Go, but can also directly generate LLVM targeted at a Semtrex VM. The trees and protocols compile directly into runnable code via LLVM-consuming code emitters. I think that in ceptr/pcubed some abstract trees are actually data rather than code, but that doesn't mean they can't be compiled the same way, such that they initialize native-language data objects.