# Connecting Mathematica and Raku

## Introduction

Connecting Mathematica and Raku lets us leverage some interesting synergies between the two systems.

In this document we describe several ways of connecting Mathematica and Raku:

1. Using external, operating system runs
2. Using in-process communication sockets
3. Using a Web service

We also discuss:

1. Encoders and decoders (for both Mathematica and Raku expressions)
2. The making of notebook Raku-style and Raku cells
3. The making and utilization of Domain Specific Language (DSL) cells

Remark: In this document I use Mathematica and Wolfram Language (WL) as synonyms. If we have to be precise, we could say something like “Mathematica is the software system and WL is the backend programming language in the software system.”

### Preliminary examples

Here is an example of a Raku cell:

``````say (1+1_000)**2

(*"1002001"*)``````

Here is an example of a Domain Specific Language (DSL) cell that does parsing and interpretation only (generates code, does not evaluate it):

``````DSL MODULE DataQuery;
use dfTitanic;
group by passengerSex;
show counts``````
``````obj = dfTitanic;
obj = GroupBy[ obj, #["passengerSex"]& ];
Echo[Map[ Length, obj], "counts:"]``````

(Below we provide more detailed examples.)

### Why is this useful?

Here are some Mathematica-centric reasons about the usefulness of connecting Mathematica and Raku:

• Raku has built-in UTF-8 symbols treatment and bignums, hence it is interesting to compare its computational model with that of Mathematica or other external evaluators that Mathematica supports.
• In my view Raku is the only “true” potential competitor of Mathematica that is not a LISP descendant.
  • I plan to discuss this in another document, not in this one.
• Raku has a great built-in system of grammars and interpreter actions, which can be used to complete, replace, or amplify WL’s built-in functionalities.
• The utilization of a constellation of DSL packages for code generation from the “Raku for Prediction” project.
  • I admit, this is a very biased and personal reason.

Here are some Raku-centric reasons about the usefulness of connecting Mathematica and Raku:

• Mathematica is the most powerful mathematical software system and has one of the oldest, most mature notebook solutions.
  • Using notebooks facilitates interactive development or research (in general and with Raku.)
• The ability to visualize and plot results derived with Raku.
• Combining evaluations with other programming languages.
  • Other programming languages that can be run in Mathematica notebooks: Python, R, Julia, etc.
• It is a great way to demonstrate the ideas and abilities of the “Raku for Prediction” project.
• Literate programming.
• Comparative testing of results correctness:
  • Verifying that new Raku implementations do “the right thing”
  • Comparison with other languages “doing the same thing”

### Orientation mind-map

A list of short descriptions of the sections below and their importance follows. (The important sections are written with bold font weight; the less important ones in bold font weight and italic slant.)

• “The journey” outlines my development efforts to make Raku available into Mathematica notebooks and other types of documents.
• “RakuMode” describes the use of Raku cells in notebooks.
• “DSLMode” describes the use of DSL cells that utilize Raku evaluations.
• “Web service” describes the web service programmed in Raku that leverages the use of the Wolfram Engine for generating code through an NLP Template Engine.
• “Encoders and decoders” discusses the programming and application of Raku and WL encoders.
• “Example: Numeric word forms” shows how Raku package commands can be used to parse integer names generated with WL built-in commands. (A “synergy” demo.)
• “Example: Stoichiometry” shows how Raku package commands can be used to retrieve chemical elements data and balance chemical equations, and how that compares to WL’s built-in functionalities. (A “comparison” demo.)
• “Making of the DSL cells” discusses how the DSL cells (style data) were programmed.
• “Making of the Raku cell” discusses how the Raku cell (style data) and “prefix” icon were programmed.
• “Future plans” outlines future plans for related development efforts.

### Execution

This notebook can be executed, but Raku has to be installed for that. The function `StartRakuProcess` takes the option setting `Raku->"some/path/to/raku"`, which specifies where the Raku executable is. (To get Raku see https://raku.org or https://rakudo.org .)

## The journey

This diagram summarizes my “Raku connectivity” journey:

``plJourney = ImageCrop@Import["https://github.com/antononcube/RakuForPrediction-book/raw/main/Diagrams/Raku-hook-up-to-notebooks-journey.jpg"]``

Here is some narration:

1. I developed a dozen Raku DSL packages that can translate natural language specifications into executable programming code in different languages.
2. Initially I used simple Operating System (OS) redirection calls to get the code generated by the DSL packages from specialized DSL-mode cells.
1. For example, the WTC-2020 presentation “Multi-language Data-Wrangling Conversational Agent”, [AAv1], used that mechanism.
2. At that point I developed the Python, R, and WL packages with names “ExternalParsersHookup”.
3. I also implemented related Raku-mode and DSL-mode notebook styles for Mathematica.
3. The initial approach had two significant problems:
1. The Raku and DSL cells did not “keep state” between each other – each cell was executing code on its own.
2. The evaluation was slow, since every time Raku had to be started and the corresponding packages loaded.
4. The desire to “do it right” raised a fair amount of questions; the main ones are:
1. How to start a resident process from within, say, Mathematica?
2. How to establish a connection to that process?
3. What are the applicable (and “standard”) software components or solutions for that kind of architecture?
5. It turned out Mathematica was fairly well equipped to do this kind of inter-process connection.
1. See the corresponding WL guides.
2. I was aware of many of the external process WL functionalities, but SocketConnect had the relatively recent addition of ZeroMQ (in 2017), which I was not aware of.
6. Research and reading for possible solutions.
1. Considered JVM architectures and org-mode related solutions.
2. The Babel org-mode solution I found for evaluating Raku source blocks was very nice and insightful to read, but essentially the same as my first Mathematica-to-Raku connection solution.
7. After some research and reading I decided to use ZeroMQ.
1. ZeroMQ is used by other external evaluators in Mathematica (explained well in the documentation.)
2. The ZeroMQ documentation is fun to read and has examples in multiple programming languages.
8. It seemed better to generalize the problem and develop a Raku module for sandboxed Raku execution.
1. The evaluation cells are now the code cells in notebooks, or Markdown, org-mode, or Raku Pod6 files.
2. The sandboxed Raku can have a “persistent” context that is accessed by the evaluation cells.
3. Implemented the Raku package “Text::CodeProcessing”, [AAp9].
4. Studied the work on connecting Raku to Jupyter by Brian Duggan, [BD1].
9. Implemented the corresponding WL packages that utilize the Raku package “Text::CodeProcessing”.
1. Made several versions of those implementations connecting Raku to R, Python, and Wolfram Engine.
2. The Raku-to-Wolfram-Engine connection was used in dedicated Web services.

## RakuMode

Here we load the “RakuMode.m” package:

``Import["https://raw.githubusercontent.com/antononcube/ConversationalAgents/master/Packages/WL/RakuMode.m"]``

Remark: The package “RakuMode.m” is very lightweight code-wise. The only “large part” is the Camelia icon for the Raku evaluation cells.

We convert notebooks into Raku-mode with this command:

``RakuMode[]``

In Raku-mode we have Raku cells that allow evaluation of Raku code (within Mathematica notebooks.)

Raku-mode cells execute Raku code via either:

• RunProcess
• The socket connection functions BinaryWrite, SocketReadMessage, and ByteArrayToString

### No ZeroMQ connections

Without ZeroMQ sockets “RakuMode.m” uses the (very lightweight) package “RakuCommand.m”. Here is an example:

``````say(1+1_000)

# 1001``````
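The one-shot mechanism behind this mode is easy to sketch in any language. Here is a minimal Python illustration of the pattern, with the Python interpreter standing in for the raku binary so the sketch runs even where Raku is not installed:

```python
import subprocess
import sys

def run_one_shot(code: str) -> str:
    """Run a code snippet in a fresh interpreter process and capture stdout,
    the way "RakuCommand.m" shells out to raku for every cell."""
    # sys.executable stands in for the raku binary; with Raku installed the
    # call would instead be subprocess.run(["raku", "-e", code], ...).
    result = subprocess.run([sys.executable, "-c", code],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

print(run_one_shot("print(1 + 1_000)"))  # a fresh process each time, so no shared state
```

Each call pays the full interpreter start-up cost and no state survives between calls, which are exactly the two problems of the initial approach described in “The journey”.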

Let us make an intentional omission in order to illustrate that RunProcess is used:

``1+1_000``

We get the message above because we essentially executed the shell command:

``raku -e "1+1_000"``

### Via ZeroMQ

Using Raku cells that do Raku evaluations over ZeroMQ sockets is a primary use case of the packages described in this document. ZeroMQ is used in WL for other external evaluators (Python, Julia, etc.)

First we start a Raku process:

``StartRakuProcess[]``

Here we create a Raku cell – using the shortcut “Shift-|” – and specify the loading of the package “Lingua::NumericWordForms”:

``````use Lingua::NumericWordForms

# "(Any)"``````

Here we define a variable and assign to it an array of numeric word forms (in Bulgarian, English, and Spanish):

``````my @nforms = ['двеста осемдесет и седем', 'two hundred and five', 'ochocientos setenta y dos'];

# [двеста осемдесет и седем two hundred and five ochocientos setenta y dos]``````

Here we parse-and-interpret several numeric word forms into numbers (and show the corresponding languages):

``````from-numeric-word-form(@nforms):p

# (bulgarian => 287 english => 205 spanish => 872)``````

The last Raku cell uses a function provided by the package in the first cell and a variable defined in the second cell. In other words, there is a common Raku context that is accessed by those cells.

### Flow chart walkthrough

Let us provide a schematic description of the example in the previous sub-section. The following flow chart summarizes the creation and evaluation of Raku cells:

``ImageCrop@Import["https://github.com/antononcube/RakuForPrediction-book/raw/main/Diagrams/Raku-execution-in-Mathematica-notebook.jpg"]``

Here is a narrative for the flow chart above:

1. The user converts the Mathematica notebook into RakuMode and starts a Raku process with StartRakuProcess
1. StartRakuProcess uses StartProcess to start Raku
2. A socket connection is established with the Raku process through ZeroMQ
2. The Raku process loads the package “Text::CodeProcessing”
1. That package is used to start a sandboxed Raku environment
2. The sandboxed Raku environment can be seen as a REPL that has its own context
3. The user makes a Raku cell and enters Raku code
4. The user triggers the evaluation of the cell
5. The cell content evaluation is done with the function RakuInputExecute
1. The Raku code is converted to a binary array and sent through a ZeroMQ socket to the Raku REPL
2. The Raku REPL evaluates the code
3. The result is sent back to WL through the ZeroMQ socket
6. The result of the Raku cell evaluation is placed in the notebook as an output cell

Remark: In the flow chart there is an optional application of the Mathematica and Raku encoders and decoders. The examples below provide more details.
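The send-evaluate-reply round trip in step 5 can be sketched in a few lines of Python. ZeroMQ itself would require the pyzmq library, so this sketch uses plain TCP sockets, and Python's eval/exec stands in for the sandboxed Raku REPL; the framing and names are illustrative only:

```python
import socket
import threading

# Persistent "REPL" context that survives across cell evaluations.
context: dict = {}

def handle(conn: socket.socket) -> None:
    """Toy stand-in for the sandboxed Raku REPL behind "Text::CodeProcessing":
    read code, evaluate it in the shared context, send the result back."""
    while True:
        code = conn.recv(4096).decode()        # the cell body, sent as bytes
        if not code:
            break
        try:
            result = str(eval(code, context))  # expression cells return a value
        except SyntaxError:
            exec(code, context)                # statement cells mutate the context
            result = "(Any)"                   # Raku-style "no value" marker
        conn.sendall(result.encode())
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))                     # ephemeral port; the real setup uses ZeroMQ
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=lambda: handle(srv.accept()[0]), daemon=True).start()

# Client side, mimicking RakuInputExecute's write/read round trip.
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"x = 41"); cli.recv(4096)         # first "cell" defines a variable
cli.sendall(b"x + 1")                          # second "cell" sees the same context
reply = cli.recv(4096).decode()
print(reply)
cli.close()
```

The second “cell” sees the variable defined by the first one because both are evaluated against the same persistent context, which is exactly what makes the Raku cells in a notebook share state.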

## DSLMode

Here we load the DSLMode package (which triggers the loading of other packages for different computational workflows):

``Import["https://raw.githubusercontent.com/antononcube/ConversationalAgents/master/Packages/WL/DSLMode.m"];``

Here we convert the notebook into DSL-mode:

``DSLMode[]``

Here we start a Raku process if we have not started one already:

``(*StartRakuProcess[]*)``

Here we use natural language commands to specify a data wrangling workflow in a DSL parsing cell:

DSL TARGET Python::pandas; include setup code; load the dataset iris; group by the column Species; show counts

``````import pandas
from ExampleDatasets import *

obj = example_dataset('iris')
obj = obj.groupby(["Species"])
print(obj.size())

# Species
# setosa        50
# versicolor    50
# virginica     50
# dtype: int64``````

Remark: Evaluating the DSL cell above produces a Python cell, which was then manually evaluated.

In the next example we use a DSL execution cell, but in order to see the DSL parser-interpreter result we are going to change the Method option of the function DSLInputExecute to print the generated code before execution:

``````SetOptions[DSLInputExecute, Method -> "PrintAndEvaluate"]

(*{Method -> "PrintAndEvaluate"}*)``````

Here is a DSL evaluation cell with the same data wrangling workflow specification as above except the target language is Raku:

DSL TARGET Raku::Reshapers; include setup code; load the dataset mtcars ; group by the column cyl; show counts

``````use Data::Reshapers;
use Data::Summarizers;
use Data::ExampleDatasets;

my \$obj = example-dataset('mtcars') ;
\$obj = group-by( \$obj, "cyl") ;
say "counts: ", \$obj>>.elems

# counts: {4 => 11, 6 => 7, 8 => 14}``````

Remark: This time the generated code was automatically evaluated when the DSL cell was evaluated.

Remark: The DSL parsing cell has light blue background, the DSL evaluation cell has light yellow background.

Here we can see the output from Raku before it is post-processed in ExternalParsersHookup`ToDSLCode:

``````res =
ToDSLCode["DSL TARGET Python::pandas;load the dataset mtcars ;group by the column cyl;show counts", Method -> Identity];
ResourceFunction["GridTableForm"][List @@@ Normal[KeySort@res], TableHeadings -> {"Key", "Value"}]``````

The underlying Raku function is ToDSLCode from the package “DSL::Shared::Utilities::ComprehensiveTranslation”:

``````ToDSLCode('DSL TARGET Python::pandas;
load the dataset mtcars ;
group by the column cyl;
show counts')

{CODE => obj = example_dataset('mtcars')obj = obj.groupby([\"cyl\"])print(obj.size()),
COMMAND => DSL TARGET Python::pandas;load the dataset mtcars ;group by the column cyl;show counts,
DSL => DSL::English::DataQueryWorkflows,
DSLFUNCTION => proto sub ToDataQueryWorkflowCode (Str \$command, Str \$target = \"tidyverse\", |) {*},
DSLTARGET => Python::pandas,
USERID => }``````

Remark: By default Raku’s ToDSLCode returns a hash. WL’s ToDSLCode returns an association with Method->Identity.

## Web service

We can provide a Web service that translates natural language DSL specifications into executable code via Cro, a constellation of Raku libraries. See the video [AAv4] for a demonstration of such a system. Below we refer to it as the Cro Web Service (CWS).

### Getting template code

Here is an example of using CWS through Mathematica’s web interaction function `URLRead`, [WRI3], in order to get the R code of Latent Semantic Analysis (LSA) workflow:

``````command = "use aAbstracts; make document term matrix;apply LSI functions IDF, None, Cosine; extract 40 topics using method SVD;echo topics table" // StringTrim;
res = Import@URLRead[<|"Scheme" -> "http", "Domain" -> "accendodata.net", "Port" -> "5040", "Path" -> {"translate", "qas"}, "Query" -> <|"command" -> command, "lang" -> "R"|>|>];``````
``ResourceFunction["GridTableForm"][List @@@ ImportString[res, "JSON"], TableHeadings -> {"Key", "Value"}]``

The code was obtained by using an LSA template, the slots of which were filled in by utilizing a Question Answering System (QAS). See the project “NLP Template Engine” and the movie “NLP Template Engine, Part 1”.

Remark: The QAS utilized in this implementation is based on WL’s function FindTextualAnswer.
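For reference, the same request can be assembled outside of WL. The following Python sketch only constructs the URL that the URLRead call above sends; it does not contact the server, which may or may not still be live:

```python
from urllib.parse import urlencode, urlunsplit

# Assemble the CWS request URL: scheme, host:port, path, and the
# query holding the DSL command and the target language.
command = ("use aAbstracts; make document term matrix;"
           "apply LSI functions IDF, None, Cosine; "
           "extract 40 topics using method SVD;echo topics table")
query = urlencode({"command": command, "lang": "R"})
url = urlunsplit(("http", "accendodata.net:5040", "/translate/qas", query, ""))
print(url)
```
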

### Schematic overview

Here is a components diagram of the process utilized above:

``ImageCrop@Import["https://github.com/antononcube/RakuForPrediction-book/raw/main/Diagrams/DSL-Web-Service-via-Cro-with-WE-QAS.jpg"]``

The components are (left to right):

• Any notebook or Integrated Development Environment (IDE) editable file.
• Web service API for CWS
• Web service run on a certain server
• CWS that is up and running
• Up and running Wolfram Engine to which CWS connects via ZeroMQ

Here is some narrative for getting DSL translation code by the NLP Template Engine:

1. In a notebook invoke a call to CWS
2. CWS uses the “resident” process implemented in Raku (using the family of libraries Cro)
3. CWS connects to Wolfram Engine through ZeroMQ
4. Wolfram Engine uses the packages of NLP Template Engine to fill-in the slots of relevant code templates
• The relevant templates are guessed by a Machine Learning classifier.
5. The result is given back to the notebook

## Encoders and decoders

### Setup

Here we load a WL package with a decoder of Raku expressions:

``Import["https://raw.githubusercontent.com/antononcube/ConversationalAgents/master/Packages/WL/RakuDecoder.m"]``

Here we load a WL package with an encoder to Raku expressions:

``Import["https://raw.githubusercontent.com/antononcube/ConversationalAgents/master/Packages/WL/RakuEncoder.m"]``

Here we load a Raku package that has functions to encode Raku objects into WL expressions:

``````use Mathematica::Serializer;

(*"(Any)"*)``````

### Encoding to Raku

Here we create a small random dataset:

``````SeedRandom;
dsRand = ResourceFunction["RandomTabularDataset"][{4, 3}]``````

Here is how the dataset looks encoded in Raku (an array of hashes):

``````ToRakuCode[dsRand]

(*"[%('agree'=>-7.08447,'prankster'=>'while','rapture'=>-34),%('agree'=>-8.28714,'prankster'=>'extreme','rapture'=>86),%('agree'=>1.69727,'prankster'=>'esprit','rapture'=>-2),%('agree'=>9.38245,'prankster'=>'wintergreen','rapture'=>52)]"*)``````
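The encoder’s job is mostly a syntax mapping. Here is a minimal Python sketch of that mapping (a toy stand-in for ToRakuCode, handling only dicts, lists, strings, and numbers):

```python
def to_raku_code(value) -> str:
    """Encode a value as Raku source text: dicts become hashes %(...),
    lists become arrays [...], strings are single-quoted."""
    if isinstance(value, dict):
        pairs = ",".join(f"'{k}'=>{to_raku_code(v)}" for k, v in value.items())
        return f"%({pairs})"
    if isinstance(value, list):
        return "[" + ",".join(to_raku_code(v) for v in value) + "]"
    if isinstance(value, str):
        return f"'{value}'"
    return repr(value)  # numbers

rows = [{"agree": -7.08447, "prankster": "while", "rapture": -34}]
print(to_raku_code(rows))
# -> [%('agree'=>-7.08447,'prankster'=>'while','rapture'=>-34)]
```

The produced string has the same array-of-hashes shape as the ToRakuCode output shown above.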

Here we assign the encoded dataset to a Raku variable:

``RakuInputExecute["my @dsAWs = " <> ToRakuCode[dsRand]];``

Here is the dataset tabulated in Raku:

``````use Data::Reshapers;
say to-pretty-table(@dsAWs)

# +---------+-------------+-----------+
# | rapture |  prankster  |   agree   |
# +---------+-------------+-----------+
# |   -34   |    while    | -7.084470 |
# |    86   |   extreme   | -8.287140 |
# |    -2   |    esprit   |  1.697270 |
# |    52   | wintergreen |  9.382450 |
# +---------+-------------+-----------+``````

### Decoding from Raku

In the following Raku cell we encode the result as a WL expression:

``````@dsAWs==>encode-to-wl()

(*"WLEncoded[List[Association[Rule[\"rapture\",-34],Rule[\"prankster\",\"while\"],Rule[\"agree\",Rational[-708447,100000]]],Association[Rule[\"rapture\",86],Rule[\"agree\",Rational[-414357,50000]],Rule[\"prankster\",\"extreme\"]],Association[Rule[\"agree\",Rational[169727,100000]],Rule[\"rapture\",-2],Rule[\"prankster\",\"esprit\"]],Association[Rule[\"rapture\",52],Rule[\"agree\",Rational[187649,20000]],Rule[\"prankster\",\"wintergreen\"]]]]"*)``````

Here the string above is converted to a WL expression:

``ToExpression[%] // First``

Here we set Raku cells to use the decoding function FromRakuCode if the head of the expression is WLEncoded:

``````SetOptions[RakuInputExecute, Epilog -> FromRakuCode]

(*{"ModuleDirectory" -> "", "ModuleName" -> "", "Process" -> Automatic, Epilog -> FromRakuCode}*)``````

Let us evaluate the previous Raku cell again (we get a WL Dataset object right away):

``@dsAWs==>encode-to-wl()``

## Example: Numeric word forms

We showed some examples of parsing numeric word forms in the section “RakuMode”. In this section we extend those examples in order to demonstrate how Raku can be used in Mathematica to extend or replace existing functionalities or provide missing ones.

Here we make an association of numbers and their numeric word forms in English, then tabulate that association:

``````SeedRandom;
aNumericWordForms = KeySort@Association[# -> IntegerName[#, {"English", "Words"}] & /@ Join[RandomInteger[10^4, 3], RandomInteger[10^7, 3]]];
ResourceFunction["GridTableForm"][aNumericWordForms]``````

Here we make another number-to-word-form association using multiple languages:

``````SeedRandom;
aNumericWordForms2 = KeySort@Association[# -> IntegerName[#, {RandomChoice[{"Bulgarian", "Japanese", "Spanish"}], "Words"}] & /@ Join[RandomInteger[10^4, 3], RandomInteger[10^7, 3]]];
ResourceFunction["GridTableForm"][aNumericWordForms2]``````

Mathematica (Version 13) can parse English numeric word forms:

``````SemanticInterpretation /@ aNumericWordForms

(*<|16 -> 16, 4286 -> 4286, 5481 -> 5481, 29695 -> 29695, 8224333 -> 8224333, 9537119 -> 9537119|>*)``````

But, at this point Mathematica (Version 13) cannot parse numeric word forms in other languages:

``````SemanticInterpretation[Values[aNumericWordForms2]]

(*{\$Failed, \$Failed, \$Failed, \$Failed, \$Failed, \$Failed}*)``````

The Raku package “Lingua::NumericWordForms” recognizes (automatically) a dozen languages. Here is an example:

``````from-numeric-word-form(['two hundred and five', 'двеста осемдесет и седем']):p

# (english => 205 bulgarian => 287)``````

Here is an example of Raku-parsing of the values of the association created above:

``````RakuInputExecute["from-numeric-word-form(" <> ToRakuCode[Values[aNumericWordForms2]] <>"):p"]

(*"(japanese => 16 bulgarian => 4286 spanish => 5481 japanese => 29695 spanish => 8224333 japanese => 9537119)"*)``````
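To make the parsing idea concrete, here is a toy English-only Python stand-in for what “Lingua::NumericWordForms” does (the real package handles many languages, language recognition, and much larger grammars):

```python
# Map words to values and scales, then fold them into a number.
UNITS = {w: i for i, w in enumerate(
    ["zero", "one", "two", "three", "four", "five", "six", "seven",
     "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
     "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"])}
TENS = {"twenty": 20, "thirty": 30, "forty": 40, "fifty": 50,
        "sixty": 60, "seventy": 70, "eighty": 80, "ninety": 90}
SCALES = {"thousand": 1_000, "million": 1_000_000}

def from_word_form(text: str) -> int:
    current, total = 0, 0
    for word in text.replace("-", " ").lower().split():
        if word == "and":
            continue                    # "two hundred AND five"
        elif word in UNITS:
            current += UNITS[word]
        elif word in TENS:
            current += TENS[word]
        elif word == "hundred":
            current *= 100
        elif word in SCALES:            # thousand/million close out a group
            total += current * SCALES[word]
            current = 0
        else:
            raise ValueError(f"unknown word: {word}")
    return total + current

print(from_word_form("two hundred and five"))  # -> 205
```
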

### Performance

Raku converts numeric word forms into numbers faster than WL. Here is the WL timing:

``````AbsoluteTiming[
SemanticInterpretation[Values[aNumericWordForms]]
]

(*{0.786108, {1131, 7504, 9970, 1016128, 5341656, 6865271}}*)``````

Here is the Raku timing:

``````AbsoluteTiming[
RakuInputExecute["from-numeric-word-form(" <> ToRakuCode[Values[aNumericWordForms]] <>")"]
]

(*{0.226441, "(1131 7504 9970 1016128 5341656 6865271)"}*)``````

## Example: Stoichiometry

In this section we make a brief comparison of Mathematica and Raku over chemical elements data retrieval, molecular mass calculations, and chemical equation balancing.

In 2007, while working on WolframAlpha (W|A), I wrote the first versions of W|A’s chemical molecules parser and functions for molecular mass calculations and chemical equation balancing. (See the raw chapters in [AAr3].) In the beginning of 2021 I wrote similar functions for Raku; see the package “Chemistry::Stoichiometry”, [AAp13].

Mathematica Version 6.0 (released in 2007) introduced the function `ElementData`. Mathematica Version 13.0 (released December 2021) has the function `ReactionBalance` that balances chemical equations.

Here we load the Raku package [AAp13]:

``use Chemistry::Stoichiometry;``

### Chemical element data

Here we get element data for Chlorine:

``````chemical-element-data('Cl');

# {Abbreviation => Cl, AtomicNumber => 17, AtomicWeight => 35.45, Block => p, Group => 17, Name => chlorine,
#  Period => 3, Series => Halogen, StandardName => Chlorine}``````

Mathematica has a much larger list of element properties:

``````ElementData["Cl", "Properties"] // Length

(*86*)``````

But let us get the properties that Raku has:

``````lsProps = {"Abbreviation", "AtomicNumber", "AtomicWeight", "Block", "Group", "Name", "Period", "Series", "StandardName"};
Map[# -> ElementData["Cl", #] &, lsProps]``````

Both Mathematica and Raku know the full names of the chemical elements, but Raku has multi-language support. Here we use Raku to retrieve the names of Chlorine in different languages using the abbreviation “Cl”:

``````<Bulgarian English German Japanese Persian Russian Spanish>.map({ \$_ => chemical-element('Cl', \$_ ) })

# (Bulgarian => Хлор English => Chlorine German => Chlor Japanese => 塩素 Russian => Хлор Spanish => Cloro)``````

Here we get different types of data using Japanese, English, and Russian element names:

``````[atomic-weight('ガリウム'), chemical-element-data('oxygen'):weight, chemical-element-data('кислород'):abbr]

# [69.723 15.999 O]``````

The Japanese name “ガリウム” above is for Gallium:

``````chemical-element-data('ガリウム')

# {Abbreviation => Ga, AtomicNumber => 31, AtomicWeight => 69.723, Block => p, Group => 13, Name => gallium,
#  Period => 4, Series => PoorMetal, StandardName => Gallium}``````

### Molecular mass

Here we assign a molecule formula (of “diphenyliodonium bromide”) to a variable in Raku:

``````my \$molecule='(C6H5)2IBr'

# (C6H5)2IBr``````

Here using Mathematica we find the molecular mass:

``ChemicalFormula[StringTrim[%]]["MolecularMass"]``

Here is the molecular mass computed with Raku:

``````molecular-mass(\$molecule)

# 361.02047000000005``````

In both cases the molecule formula is parsed into pairs of element names and multipliers; for each pair a mass is computed using the corresponding element’s atomic weight, and then the masses of all pairs are totaled.
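That procedure is small enough to sketch directly. Here is a minimal Python version (the atomic-weight table is truncated to the elements needed below; “Chemistry::Stoichiometry” carries the full periodic table):

```python
import re

# Minimal atomic-weight table for the formula below.
ATOMIC_WEIGHT = {"C": 12.011, "H": 1.008, "O": 15.999,
                 "I": 126.90447, "Br": 79.904}

def molecular_mass(formula: str) -> float:
    """Parse element/multiplier pairs (with parenthesized groups)
    and total their masses."""
    def read_count(s, i):
        m = re.match(r"\d+", s[i:])
        return (int(m.group()), i + len(m.group())) if m else (1, i)

    def parse(s, i):
        mass = 0.0
        while i < len(s) and s[i] != ")":
            if s[i] == "(":
                group, i = parse(s, i + 1)
                i += 1                          # skip the ")"
                count, i = read_count(s, i)
                mass += group * count           # e.g. (C6H5)2
            else:
                element = re.match(r"[A-Z][a-z]?", s[i:]).group()
                i += len(element)
                count, i = read_count(s, i)
                mass += ATOMIC_WEIGHT[element] * count
        return mass, i

    return parse(formula, 0)[0]

print(round(molecular_mass("(C6H5)2IBr"), 5))   # -> 361.02047
```

Running it on the formula above reproduces the Raku result of 361.02047.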

### Chemical equation balancing

Chemical equation balancing can be done by representing the molecules in the equation as points of a vector space and then solving a corresponding system of linear equations.
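Here is a minimal Python sketch of that formulation: each molecule becomes a column of element counts (product columns negated), and the null space of the resulting matrix, computed with exact rational arithmetic, gives the balancing coefficients. (A toy stand-in for balance-chemical-equation; parentheses-free formulas only.)

```python
import re
from fractions import Fraction
from math import lcm

def element_counts(formula: str) -> dict:
    """Count atoms per element (flat formulas only, no parentheses)."""
    counts: dict = {}
    for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        if el:
            counts[el] = counts.get(el, 0) + (int(n) if n else 1)
    return counts

def balance(lhs: list, rhs: list) -> list:
    """Balance lhs -> rhs by solving the homogeneous linear system whose
    columns are the element-count vectors of the molecules."""
    molecules = [element_counts(m) for m in lhs] + \
                [{e: -c for e, c in element_counts(m).items()} for m in rhs]
    elements = sorted({e for m in molecules for e in m})
    # One row per element, one column per molecule.
    A = [[Fraction(m.get(e, 0)) for m in molecules] for e in elements]
    n = len(molecules)
    pivots, row = [], 0
    for col in range(n):                      # Gauss-Jordan elimination
        piv = next((r for r in range(row, len(A)) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[row], A[piv] = A[piv], A[row]
        A[row] = [x / A[row][col] for x in A[row]]
        for r in range(len(A)):
            if r != row and A[r][col] != 0:
                factor = A[r][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[row])]
        pivots.append(col)
        row += 1
    free = next(c for c in range(n) if c not in pivots)  # one degree of freedom
    coeffs = [Fraction(0)] * n
    coeffs[free] = Fraction(1)
    for r, col in enumerate(pivots):
        coeffs[col] = -A[r][free]
    mult = lcm(*[c.denominator for c in coeffs])         # smallest integer solution
    return [int(c * mult) for c in coeffs]

print(balance(["C2H5OH", "O2"], ["H2O", "CO2"]))
```

For the ethanol combustion below this yields the coefficients [1, 3, 3, 2], i.e. C2H5OH + 3 O2 -> 3 H2O + 2 CO2.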

Here we assign to a Raku variable a chemical equation string:

``````my \$chemEq='C2H5OH + O2 = H2O + CO2';

# "C2H5OH + O2 = H2O + CO2"``````

Here we balance the equation with Mathematica:

``ReactionBalance[%]``

Here we balance the equation with Raku:

``````balance-chemical-equation(\$chemEq)

# [1*C2H5OH + 3*O2 -> 2*CO2 + 3*H2O]``````

(We can see that results are the same.)

## Making of the DSLMode cells

Initially I borrowed ideas from the Wolfram Function Repository (WFR) function “DarkMode”. After that I was pretty much guided by Kuba Podkalicki on how to design and implement the functionalities behind the DSL cells.

The package “DSLMode.m” is small; it is just “a front” for the fairly big package “ExternalParsersHookup.m”, which represents all DSLs from the RakuForPrediction project.

The DSL cell style has a hard copy of the Raku cell style data from “RakuMode.m” in order to have only one package dependency (that of “ExternalParsersHookup.m”.)

## Making of the RakuMode cell

I tried to make the Raku cells resemble external evaluation cells to a large degree, but still look decisively different.

• As the cell background color I used the background color of the input/output cells in the Raku site documentation.
• Finding and making the “prefix” cell icon took a fairly long time.
• The package “RakuMode.m” had several revisions.
• Apart from the Raku cell itself, probably the most interesting or peculiar part of the package “RakuMode.m” is that it keeps the ZeroMQ Raku code it uses as a WL string template.

### Cell icon

Here are some Camelia logos:

``````ResourceFunction["GridTableForm"][{{
Import["https://raw.githubusercontent.com/uzluisf/metamorphosis/master/hex_camelia/perl6-color-logo1.png"]}},
TableHeadings -> {"Standard", "Hex"}, Background -> None]``````

Here is the graphics object used to make the image of the Camelia icon used for Raku cells:

``````Import["https://raw.githubusercontent.com/antononcube/ConversationalAgents/master/Packages/WL/HexCameliaIcons.m"];
GetHexCameliaGraphics[]``````

Remark: A selected GitHub repository image was cropped and vectorized with ImageGraphics. For more details see the descriptions in the package “HexCameliaIcons.m”, [AAp3].

### ZeroMQ template code

Here is the Raku code string template for the ZeroMQ connection:

``Magnify[RakuMode`Private`zmqScript, 0.8]``

### Comparison with other external evaluation cells

Here is how the Raku cell looks compared to other external evaluator cells:

``1+1_000``
``4^3``
``1+1_000``
``{seq(1,10,2)}``

(The order of the cells above is Raku, Julia, Python, R.)

## Future plans

Here are future plans that are (mostly) directly related to the Mathematica-and-Raku connectivity functionalities discussed above:

• More robust WL to Raku encoder.
• More robust ZeroMQ connection maintenance and support.
• Raku slang for Wolfram Language.
• Raku slang for DSL workflows.
• Making similar connections from Raku to other languages (Python, Julia).

### Functions

[WRI1] Wolfram Research, (2014), RunProcess, Wolfram Language function, https://reference.wolfram.com/language/ref/RunProcess.html.

[WRI2] Wolfram Research, (2014), StartProcess, Wolfram Language function, https://reference.wolfram.com/language/ref/StartProcess.html.

[WRI3] Wolfram Research, (2016), URLRead, Wolfram Language function, https://reference.wolfram.com/language/ref/URLRead.html.

### Mathematica packages

[AAp1] Anton Antonov, DSLMode Mathematica package, (2020-2021), ConversationalAgents at GitHub/antononcube.

[AAp2] Anton Antonov, ExternalParsersHookup Mathematica package, (2020-2021), ConversationalAgents at GitHub/antononcube.

[AAp3] Anton Antonov, HexCameliaIcons Mathematica package, (2021), ConversationalAgents at GitHub/antononcube.

[AAp4] Anton Antonov, RakuCommand Mathematica package, (2021), ConversationalAgents at GitHub/antononcube.

[AAp5] Anton Antonov, RakuDecoder Mathematica package, (2021), ConversationalAgents at GitHub/antononcube.

[AAp6] Anton Antonov, RakuEncoder Mathematica package, (2021), ConversationalAgents at GitHub/antononcube.

[AAp7] Anton Antonov, RakuMode Mathematica package, (2020-2021), ConversationalAgents at GitHub/antononcube.

### Raku packages

[AAp9] Anton Antonov, Text::CodeProcessing, (2021), Raku Modules.

[AAp10] Anton Antonov, Lingua::NumericWordForms, (2021), Raku Modules.

[AAp11] Anton Antonov, Data::Reshapers, (2021), Raku Modules.

[AAp12] Anton Antonov, DSL::English::DataQueryWorkflows Raku package, (2020), GitHub/antononcube.

[AAp13] Anton Antonov, Chemistry::Stoichiometry, (2021), Raku Modules.

### Repositories

[AAr1] Anton Antonov, “Raku for Prediction” book, (2021), GitHub/antononcube.

[AAr2] Anton Antonov, NLP Template Engine, (2021), GitHub/antononcube.

[AAr3] Anton Antonov, “Mathematica for Chemists and Chemical Engineers” book project, (2020), GitHub/antononcube.

[BD1] Brian Duggan et al., p6-jupyter-kernel, (2017-2020), GitHub/bduggan.

### Videos

[AAv1] Anton Antonov, “Multi-language Data-Wrangling Conversational Agent”, WTC-2020 presentation, (2020), Wolfram at YouTube.

[AAv2] Anton Antonov, “Raku for Prediction”, (2021), The Raku Conference.

[AAv3] Anton Antonov, “NLP Template Engine, Part 1”, (2021), Simplified Machine Learning Workflows at YouTube.

[AAv4] Anton Antonov, “Doing it like a Cro (Raku data wrangling Shortcuts demo)”, (2021), Simplified Machine Learning Workflows at YouTube.