Notebook transformations

Introduction

In this blog post we describe a series of (computational) notebook transformations done with different tools. As a running example we use a set of recent articles and notebooks for processing the English and Russian transcripts of a recent two-hour interview. The workflows in the notebooks are in Raku and Wolfram Language (WL).

Remark: Wolfram Language (WL) and Mathematica are used as synonyms in this document.

Remark: Using notebooks with Large Language Model (LLM) workflows is convenient because the WL LLM functions are also implemented in Python and Raku, [AA1, AAp1, AAp2].

One can say that this blog post attempts to advertise the Raku package “Markdown::Grammar”, [AAp3], which is demonstrated in the videos [AAv5, AAv6].

TL;DR: Using Markdown as an intermediate format, we can easily enough convert between Jupyter and Mathematica notebooks.
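For example, here is a minimal Raku sketch of the Markdown-to-Mathematica direction with “Markdown::Grammar”; the file names are hypothetical, and the from-markdown call follows the package’s README:

use Markdown::Grammar;

# Read a (hypothetical) Markdown file and convert its content
# into Mathematica notebook source
my $md = slurp 'interview-workflows.md';
spurt 'interview-workflows.nb', from-markdown($md, to => 'mathematica');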


Transformation trip

The transformation trip starts with the notebook of the article “LLM aids for processing of the first Carlson-Putin interview”, [AA2], and proceeds through the following steps:

  1. Make the Raku Jupyter notebook
  2. Convert the Jupyter notebook into Markdown
    • Using Jupyter’s built-in converter (see the command sketch after this list)
  3. Publish the Markdown version to WordPress, [AA2]
  4. Convert the Markdown file into a Mathematica notebook
  5. Publish that to Wolfram Community
    • That notebook was deleted by the moderators because it does not feature WL
  6. Make the corresponding Mathematica notebook using WL LLM functions
  7. Publish to Wolfram Community
  8. Make the Russian version with the Russian transcript
  9. Publish to Wolfram Community
    • That notebook was deleted by the moderators because it is not in English
  10. Convert the Mathematica notebook to Markdown
    • Using Kuba Podkalicki’s M2MD, [KPp1]
  11. Publish to WordPress, [AA3]
  12. Convert the Markdown file to Jupyter
  13. Re-make the (Russian described) workflows using Raku, [AAn5]
  14. Re-make workflows using Python, [AAn6], [AAn7]
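Steps 2 and 12 are single-command conversions; here is a sketch of invoking them from Raku (the file names are hypothetical):

# Step 2: Jupyter notebook to Markdown via Jupyter's built-in converter
run <jupyter nbconvert --to markdown interview-workflows.ipynb>;

# Step 12: Markdown back to a Jupyter notebook via jupytext
run <jupytext --to notebook interview-workflows.md>;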

Here is the corresponding Mermaid-JS diagram (using the package “WWW::MermaidInk”, [AAp6]):

use WWW::MermaidInk;

my $diagram = q:to/END/;
graph TD
A[Make the Raku Jupyter notebook] --> B[Convert the Jupyter notebook into Markdown]
B --> C[Publish to WordPress]
C --> D[Convert the Markdown file into a Mathematica notebook]
D --> E[Publish that to Wolfram Community]
E --> F[Make the corresponding Mathematica notebook using WL LLM functions]
F --> G[Publish to Wolfram Community]
G --> H[Make the Russian version with the Russian transcript]
H --> I[Publish to Wolfram Community]
I --> J[Convert the Mathematica notebook to Markdown]
J --> K[Publish to WordPress]
K --> L[Convert the Markdown file to Jupyter]
L --> M[Re-make the workflows using Raku]
M --> N[Re-make the workflows using Python]
N -.-> Nen([English])
N -.-> Nru([Russian])
C -.-> WordPress{{WordPress}}
K -.-> WordPress
E -.-> |Deleted:<br>features Raku| WolframCom{{Wolfram Community}}
G -.-> WolframCom
I -.-> |"Deleted:<br>not in English"|WolframCom
D -.-> MG[[Markdown::Grammar]]
B -.-> Ju{{Jupyter}}
L -.-> jupytext[[jupytext]]
J -.-> M2MD[[M2MD]]
E -.-> RakuMode[[RakuMode]]
END

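# Render the diagram via the mermaid.ink web service as a Markdown image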
say mermaid-ink($diagram, format => 'md-image');

Clarifications

Russian versions

The first Carlson-Putin interview processed in the notebooks was held in both English and Russian. I think doing just the English study is “half-baked.” Hence, I did the workflows with the Russian text as well and translated the related explanations into Russian.

Remark: The Russian versions are done in all three programming languages: Python, Raku, Wolfram Language. See [AAn4, AAn5, AAn7].

Using different programming languages

From my point of view, having a Raku-enabled Mathematica / WL notebook is a strong statement about WL. A fair amount of coding was required for the paclet “RakuMode”, [AAp4].

Implementing that functionality is possible because WL has extensive external evaluation capabilities.
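For instance, WL’s built-in ExternalEvaluate covers several languages out of the box, and “RakuMode” builds a Raku session on top of similar machinery. Here is a minimal WL sketch; the paclet loading lines follow Wolfram Paclet Repository conventions and are an assumption:

(* Built-in external evaluation; "Python" is just an illustrative target *)
ExternalEvaluate["Python", "1 + 1"]

(* Assumed loading of RakuMode from the Wolfram Paclet Repository;
   RakuMode[] restyles the current notebook to use Raku cells *)
PacletInstall["AntonAntonov/RakuMode"];
Needs["AntonAntonov`RakuMode`"];
RakuMode[]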

When we compare WL, Python, and R over Machine Learning (ML) projects, WL appears to me to be, overall, the best choice for ML.

I do use these sets of comparison posts at Wolfram Community to support my arguments in discussions about which programming language is better. (Or bigger.)

Example comparison: WL workflows

The following three Wolfram Community posts have more or less the same content — “Workflows with LLM functions” — but in different programming languages:

Example comparison: LSA over mandala collections

The following Wolfram Community posts have more or less the same content — “LSA methods comparison over random mandalas deconstruction”, [AAv1] — but in different programming languages:

Remark: The movie [AAv1], linked in those notebooks, also shows a comparison with the LSA workflow in R.

Using Raku with LLMs

I generally do not like using Jupyter notebooks, but using Raku with LLMs in them is very convenient, [AAv2, AAv3, AAv4]. WL is clunkier when it comes to pre- and post-processing of LLM results.
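For illustration, here is a minimal “LLM::Functions” sketch; it assumes a configured LLM service access key, and the prompt and file name are made up:

use LLM::Functions;

# Make a text-summarization function backed by an LLM
my &summarize = llm-function({"Summarize the following text:\n$_"});

# Apply it to a (hypothetical) transcript file
say summarize(slurp 'interview-transcript.txt');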

Also, the Raku chatbooks, [AAp5], provide a better environment for displaying the often Markdown-formatted results of LLMs. (Like the ones in the notebooks discussed here.)
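For example, in a chatbook an LLM conversation cell is just a code cell that starts with a chat magic; a sketch, assuming the “Jupyter::Chatbook” kernel and a configured LLM service:

#% chat
Summarize the main topics of the interview in three bullet points.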


References

Articles

[AA1] Anton Antonov, “Workflows with LLM functions”, (2023), RakuForPrediction at WordPress.

[AA2] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (2024), RakuForPrediction at WordPress.

[AA3] Anton Antonov, “LLM помогает в обработке первого интервью Карлсона-Путина”, (2024), MathematicaForPrediction at WordPress.

[AA4] Anton Antonov, “Markdown to Mathematica converter”, (2022), Wolfram Community.

Notebooks

[AAn1] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (Raku/Jupyter), (2024), RakuForPrediction-book at GitHub/antononcube.

[AAn2] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (Raku/Mathematica), (2024), WolframCloud/antononcube.

[AAn3] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (WL/Mathematica), (2024), WolframCloud/antononcube.

[AAn4] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (in Russian), (WL/Mathematica), (2024), WolframCloud/antononcube.

[AAn5] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (in Russian), (Raku/Jupyter), (2024), RakuForPrediction-book at GitHub/antononcube.

[AAn6] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (Python/Jupyter), (2024), PythonForPrediction-blog at GitHub/antononcube.

[AAn7] Anton Antonov, “LLM aids for processing of the first Carlson-Putin interview”, (in Russian), (Python/Jupyter), (2024), PythonForPrediction-blog at GitHub/antononcube.

Packages, paclets

[AAp1] Anton Antonov, LLM::Functions Raku package, (2023-2024), GitHub/antononcube.

[AAp2] Anton Antonov, LLM::Prompts Raku package, (2023), GitHub/antononcube.

[AAp3] Anton Antonov, Markdown::Grammar Raku package, (2022-2023), GitHub/antononcube.

[AAp4] Anton Antonov, RakuMode WL paclet, (2022-2023), Wolfram Language Paclet Repository.

[AAp5] Anton Antonov, Jupyter::Chatbook Raku package, (2023-2024), GitHub/antononcube.

[AAp6] Anton Antonov, WWW::MermaidInk Raku package, (2023), GitHub/antononcube.

[KPp1] Kuba Podkalicki, M2MD WL paclet, (2018-2023), GitHub/kubaPod.

Videos

[AAv1] Anton Antonov, “Random Mandalas Deconstruction in R, Python, and Mathematica (Greater Boston useR Meetup, Feb 2022)”, (2022), YouTube/@AAA4Prediction.

[AAv2] Anton Antonov, “Jupyter Chatbook LLM cells demo (Raku)”, (2023), YouTube/@AAA4Prediction.

[AAv3] Anton Antonov, “Jupyter Chatbook multi cell LLM chats teaser (Raku)”, (2023), YouTube/@AAA4Prediction.

[AAv4] Anton Antonov, “Integrating Large Language Models with Raku”, (2023), YouTube/@therakuconference6823.

[AAv5] Anton Antonov, “Markdown to Mathematica converter (CLI and StackExchange examples)”, (2022), YouTube/@AAA4Prediction.

[AAv6] Anton Antonov, “Markdown to Mathematica converter (Jupyter notebook example)”, (2022), YouTube/@AAA4Prediction.
