<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Vita ex machina]]></title><description><![CDATA[Thoughts, ethics and tutorials in generative AI for ecological, environmental and socio-ecological scientists]]></description><link>https://vitaexmachina.substack.com</link><image><url>https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png</url><title>Vita ex machina</title><link>https://vitaexmachina.substack.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 07 May 2026 18:05:58 GMT</lastBuildDate><atom:link href="https://vitaexmachina.substack.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Chris Brown]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[vitaexmachina@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[vitaexmachina@substack.com]]></itunes:email><itunes:name><![CDATA[Chris Brown]]></itunes:name></itunes:owner><itunes:author><![CDATA[Chris Brown]]></itunes:author><googleplay:owner><![CDATA[vitaexmachina@substack.com]]></googleplay:owner><googleplay:email><![CDATA[vitaexmachina@substack.com]]></googleplay:email><googleplay:author><![CDATA[Chris Brown]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Quick and easy AI generated games for teaching]]></title><description><![CDATA[Generative AI can be used to make fun games so your lectures are more engaging]]></description><link>https://vitaexmachina.substack.com/p/quick-and-easy-ai-generated-games</link><guid 
isPermaLink="false">https://vitaexmachina.substack.com/p/quick-and-easy-ai-generated-games</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Wed, 22 Apr 2026 07:00:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!G_CK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc3481614-1008-4aa9-92ee-e0b76a4f7e12_501x511.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>One of the benefits of generative AI is quick prototyping of web apps. I&#8217;ve used this advantage to bring simple games into my lectures. This helps make slide-heavy lectures more engaging and gives the students an alternative pathway for learning.</p><p>The games are nothing sophisticated. The graphics are simple and there are at most 1-2 levels. The advantage is that they are cheap and easy to make. So you can bash one together and try it out in a lecture without a big time investment and with minimal technical know-how.</p><p>The simplest games take about the same time investment as making the PowerPoint slides I would have used before.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!G_CK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc3481614-1008-4aa9-92ee-e0b76a4f7e12_501x511.png" width="501" height="511" alt="The seabird game"><figcaption class="image-caption">The seabird game</figcaption></figure></div><p>I&#8217;ll show a few examples (a more comprehensive list is below). Then I want to discuss how to build these games. It&#8217;s pretty simple; there are just a few steps to follow.</p><p>I teach in marine science and statistics, so the games are all pretty marine biology and stats focused. But I hope you get some ideas to make your own for whatever discipline you teach in.</p><p><strong><a href="https://www.seascapemodels.org/connections-game/">Word match game</a></strong> inspired by the New York Times connections puzzle. I made my own version that lets me customize what words you have to match. This one is great for a quick recap quiz 10 or 20 minutes into a lecture. The students find it more fun (and challenging) than regular multiple-choice quizzes.</p><p><strong><a href="https://www.seascapemodels.org/seabird-block-game/">Seabird block game</a></strong>. This one takes a bit more explaining. But the basic idea is that the player is a seabird who has to navigate a wind field to catch fish and feed their chick, all while competing for fish with a fishing boat.</p><p><strong><a href="https://www.seascapemodels.org/impact-eval-app/">Impact evaluation and monitoring design</a></strong>. This one is more technical. It is about choosing sites on a map to do ecological monitoring. The lesson is in choosing a representative set of sites that gives you a statistically accurate answer.</p><p>Then I&#8217;ve also started on a more ambitious project, which I call <strong><a href="https://www.seascapemodels.org/FANGS/">FANGS</a></strong>. This one attempts to recreate complex statistical software in the web browser. 
This means students can try out this stats method for a quick intro without having to go through software installation, which I find can waste an hour or more of class time.</p><p>FANGS is still a work in progress and has taken 30+ prompts (the others were more like 1-2 prompts). But the prototype works well, so I&#8217;m going to keep refining it as I have time.</p><h2>Learning goal</h2><p>There are a few key elements to setting up these games.</p><p>First you need a learning goal. The game should be integrated into your teaching. So think about what lesson you want your students to learn from playing it.</p><p>For the connections game it&#8217;s just about recapping key technical words. For the seabird game we follow up with a discussion about seabird conservation, drawing on the conflicts that emerge in the game. I designed the game with those conflicts to create these discussion opportunities.</p><p>For the statistical games there are precise mathematical concepts I am trying to teach the students.</p><p>Adding pop-ups with questions is one way to do this. Or you can just discuss in class.</p><p>The impact evaluation and monitoring design game deals with more complex statistical concepts. That one isn&#8217;t self-contained in the webpage; I walk through it in class before setting the students loose on the game.</p><h2>Game building and hosting</h2><p>Here&#8217;s the process I follow. I first write detailed instructions for what I want. Being detailed and specific is key; the prompt can run into 100s or 1000s of words. If the game doesn&#8217;t work out, it&#8217;s often best to refine your prompt and start again.</p><p>I&#8217;m using Claude Code and/or GitHub Copilot in VS Code.</p><p><a href="https://github.com/cbrown5/connections-game/blob/main/game-idea.md">Here&#8217;s the prompt I used for the word match game</a> and <a href="https://github.com/cbrown5/seabird-block-game/blob/main/game-idea.md">here&#8217;s the one for the seabird game</a>. 
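To give a sense of how simple the underlying game logic can be, here is a minimal sketch of the word-match rule. This is an illustration only: the actual games are standalone JavaScript pages, and these word groups are invented, not the real game data.

```python
# Minimal sketch of the word-match ("connections") rule.
# Hypothetical word groups for illustration; the real game loads its own data file.
GROUPS = {
    "Seabirds": {"albatross", "petrel", "shearwater", "gannet"},
    "Fishing gear": {"trawl", "longline", "gillnet", "seine"},
}

def check_guess(guess, groups=GROUPS):
    """Return the category name if the guessed words exactly match
    one hidden group, otherwise None."""
    selected = set(guess)
    for name, words in groups.items():
        if selected == words:
            return name
    return None
```

The whole game is really just this check plus some rendering, which is why a single detailed prompt can get an AI agent most of the way there.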
Some testing and follow-up prompting was required to get rid of bugs and make the games work like I wanted.</p><p>The other key is to have the AI agent create the game as a standalone webpage. Then you don&#8217;t need a server to run the game; the user just goes to the webpage and it runs in their browser. For this reason you need to keep the games pretty small and the graphics simple, otherwise they may run very slowly.</p><p>Play the game locally to check it works and the logic is robust (usually you can just open the file in a web browser). If there are bugs, or worse, logical errors, you can guarantee the students will find them. Also you don&#8217;t want the students to learn the wrong lessons.</p><p>Once I&#8217;m happy with the game I sync the code repository to GitHub. GitHub is a cloud service for storing code. There are plenty of tutorials online if you don&#8217;t know how to use it (or AI can help you). It&#8217;s pretty straightforward.</p><p>Once it&#8217;s on GitHub you need to set up <a href="https://docs.github.com/en/pages">GitHub Pages</a> for your repository. This means people can navigate to a URL and the code will run in their browser (rather than them just seeing the raw code). For instance, the code for the seabird game lives here: https://github.com/cbrown5/seabird-block-game/</p><p>But I&#8217;ve activated &#8216;pages&#8217; on that repository, so if you go here your browser will load the code so you can play: cbrown5.github.io/seabird-block-game</p><h2>Future improvements</h2><p>I&#8217;d love to have more time to invest in this system, so I can create compelling games quickly and focus on learning elements rather than getting the technical aspects right. 
A few things we need to make are:</p><p><strong>Standard prompts/specification sheets for game code</strong></p><p>Including elements like what JavaScript packages to use (there are specialized gaming ones like Phaser3), how to render graphs, and a template for quick quiz pop-ups.</p><p><strong>System for creating graphics</strong> Currently the games use emojis or the AI draws the graphics as an SVG (i.e. it writes code to create coordinates that draw lines), which is an ineffective way to generate images with AI (<a href="https://simonwillison.net/2026/Apr/8/muse-spark/">see these pelicans on bicycles</a>). These methods are quick, but also not as compelling as I would like.</p><p>A more effective system would use an image generation model (like Nano Banana) to make sprite sheets. This should be possible using <a href="https://minimaxir.com/2025/12/nano-banana-pro/">Max Woolf&#8217;s system for creating images in a grid</a>. But my attempts have required a lot of manual handling, as the images are never quite perfectly on a grid. I&#8217;m sure someone will sort this out for us soon and put it into a nice &#8216;skill&#8217;.</p><p><strong>Multiplayer games and class feedback</strong></p><p>A downside to a fully browser-based experience is that there is no data exchange with a server. That precludes multiplayer games, or getting a summary of student answers/outcomes. Firebase may be one way to cheaply create a multiplayer game or get summary stats on student engagement (like how many won/lost).</p><p><strong>Accessible learning</strong></p><p>My prompts aren&#8217;t optimised for accessibility, e.g. font sizes and colours. This would be an easy win.</p><p><strong>Games in assessment</strong></p><p>My big-picture dream is to have some of the smaller assessment items replaced with games. So rather than the weekly quiz, the students have to solve a puzzle or win a game. 
I have this idea of an RPG game where the player wanders around our marine science laboratories. In each lab they collect some data. The final &#8216;boss battle&#8217; for the assessment is putting that data together into a coherent analysis to answer a topical research question. Why can&#8217;t assessment be fun too?</p><p>Anyway, I recommend starting small with low ambitions and building up from there. Let me know how you go in your classes if you try this.</p><h2>List of games to date</h2><p><a href="https://www.seascapemodels.org/connections-game/">Word match game</a>. Code and instructions are here if you want to make your own version: https://github.com/cbrown5/connections-game</p><p>Or upload your own <a href="https://www.seascapemodels.org/connections-game/word-match-upload.html">word match data file</a>.</p><p><a href="https://www.seascapemodels.org/seabird-block-game/">Seabird block game</a>. Catch fish to hatch your egg and successfully raise your chick.</p><p><a href="https://www.seascapemodels.org/seabird-block-game/ocean-game">Larval fish avoiding jellyfish and trying not to starve</a></p><p>And there are even more games; these ones I made &#8216;live&#8217; for a presentation where the <a href="https://www.seascapemodels.org/posts/2025-09-17-ai-generated-scicomm-games/">audience developed a game for science communication and played it all within an hour</a>.</p><h3>Statistical games</h3><p><a href="https://www.seascapemodels.org/impact-eval-app/">Impact evaluation and monitoring design</a></p><p>Game code for my stats class: https://github.com/cbrown5/KSM721-games</p><p>Including <a href="https://www.seascapemodels.org/KSM721-games/likelihood-distribution-game/">likelihoods</a> and <a href="https://www.seascapemodels.org/KSM721-games/regression-likelihood-game/">regression</a>.</p><p><a href="https://www.seascapemodels.org/mermaid-to-stan/">App to turn a diagram into statistical code</a>.</p><p><a href="https://www.seascapemodels.org/FANGS/">My 
most ambitious AI-generated project FANGS, a Bayesian computation engine that runs in a web browser</a>. Warning: it&#8217;s a work in progress!</p><p>Finally, I&#8217;ve made a few just for fun too. <a href="https://www.seascapemodels.org/checklist-game/">These checklist games</a> are meant to help my kids get ready in the morning.</p>]]></content:encoded></item><item><title><![CDATA[Do AI coding agents save scientists time?]]></title><description><![CDATA[They speed up initial code development, but then tests for errors can add more time. It's still an open question whether there is a net benefit]]></description><link>https://vitaexmachina.substack.com/p/do-ai-coding-agents-save-scientists</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/do-ai-coding-agents-save-scientists</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Mon, 13 Apr 2026 23:14:52 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!duy2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02f36020-849b-4211-a9e5-285f58dc0d2a_1024x1024.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>I&#8217;m often asked if using AI coding agents saves time. Yes, they write code very quickly and can <a href="https://onlinelibrary.wiley.com/doi/10.1111/faf.70079">complete entire ecological data analyses</a>. 
</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!duy2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F02f36020-849b-4211-a9e5-285f58dc0d2a_1024x1024.png" width="1024" height="1024" alt=""><figcaption class="image-caption">Do agents really help when the deadlines are approaching?</figcaption></figure></div><p>But the code also requires careful checking for logical errors. Our recent analysis shows this. The best LLMs can complete entire analyses and all the code works well. But there was a decent chance of subtle logical errors, and these would require pretty deep human understanding of the code to correct.</p><p>There&#8217;s another issue: using code you don&#8217;t understand. I often find the agents produce so much code, and I&#8217;m not comfortable using it until I understand the logic line-by-line.</p><p>In those cases I find it&#8217;s faster to use an autocomplete AI assistant so I&#8217;m going line-by-line, rather than an agentic loop that completes the entire piece of work. 
</p><p>I think the jury is still out on whether there is a net time benefit to using agents. The only way to really answer is a randomised controlled trial where you time how long it takes scientists to fully complete a task.</p><p>The only study I&#8217;m aware of is quite limited and looked at software developers. They found the <a href="https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/">developers often projected they would be faster with the AI tools, but were actually slower at tasks by the end of the project</a>.</p><p>It&#8217;s true that using AI is fun because it makes so much progress, but that fun feeling might be a trap for us.</p><p>It&#8217;s likely that the answer is context dependent.</p><p>I suspect most of the coding scientists do (like writing models that represent ecosystems) actually requires the human to understand what it does. In these cases agents don&#8217;t make sense, because you need to go back and review the code carefully to understand it anyway.</p><p>On the other hand, if you are making software tools that are easy to verify, then agents are great. For instance, I often use them to write code for non-standard figures. I don&#8217;t need to know the code in that case because I can check the output is correct visually.</p><p>Likewise, interactive Shiny apps are another example of time saving. The agent can take some (good) code you already have and turn it into an app. It&#8217;s easy to test and check because you just use the app.</p><p>People often point to advances in LLMs and say that soon they will be good enough to do all the coding for us. I&#8217;m not so sure that applies to science. <a href="https://onlinelibrary.wiley.com/doi/10.1111/faf.70079">In fact, we found a later version of Claude Sonnet performed about the same as an earlier version on scientific logic; it just made different types of errors</a>.</p><p>I think the advances need to come in the ways we interact with and use the LLMs. 
</p><p>The ultimate goal should be efficient but also high-quality work. That&#8217;s something I want to look at in my next agentic AI study.</p>]]></content:encoded></item><item><title><![CDATA[The problem with paper reviews by large language models]]></title><description><![CDATA[They tend to restate shortcomings that were identified in the manuscript]]></description><link>https://vitaexmachina.substack.com/p/the-problem-with-paper-reviews-by</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/the-problem-with-paper-reviews-by</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Tue, 07 Apr 2026 21:56:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>The &#8216;T&#8217; in &#8216;GPT&#8217; stands for transformer. This means the LLM takes text and transforms it to write a response.</p><p>The problem with LLM-written peer reviews is that they offer suggestions that are a waste of authors&#8217; time to respond to. LLMs tend to rehash shortcomings already identified in the manuscript. In my experience this ends up being AI slop, not new insights.</p><p>I&#8217;ve been running an &#8216;LLM pre-review&#8217; of my manuscripts for a couple of years now. This helps identify problems, and make improvements, before you submit the article. Some of the recommendations the LLM makes are truly helpful alternative perspectives on the manuscript.</p><p>But many others are just restatements of what was already addressed as a shortcoming.</p><p>In my experience of publishing, human reviewers will usually be satisfied with a well-written Discussion that acknowledges a study&#8217;s shortcomings. 
That is good, because no science is perfect and it&#8217;s important to acknowledge shortcomings that set the scene for future studies.</p><p>An LLM will read shortcomings and &#8216;transform&#8217; them: it will tend to ask for further analyses or data collection. And the LLM will pretty much always ask for further analyses, even if you&#8217;ve done the analyses suggested the first time. It&#8217;s an endless loop.</p><p>We recently received reviews on a study I was coauthoring. The article had already been reviewed twice at that journal, and new analyses added as requested by the reviewers. Unfortunately the editor then chose to send the manuscript to a new reviewer for the final round of reviews.</p><p>This new reviewer sent in a review that looked very &#8216;AI generated&#8217; to me (though I can&#8217;t be sure). The impersonal but professional and polished style of writing was one clue. Another was the requests for further analyses on shortcomings we&#8217;d already looked at. The review was not identifying anything new we&#8217;d overlooked. It was simply picking up on the shortcomings we&#8217;d already added analyses to address and asking for further analyses on those.</p><p>LLMs can provide helpful advice on peer reviews. They can identify oversights that authors have missed. However, on well-polished papers they don&#8217;t have much to add; in fact they&#8217;ll just restate the problems the authors have already identified. 
</p>]]></content:encoded></item><item><title><![CDATA[AI agents can create convincing ecological models, but you still need to know what you're doing]]></title><description><![CDATA[We ran AI agents on three ecological and fisheries modelling tasks; they were amazingly capable, but human expertise is still critical.]]></description><link>https://vitaexmachina.substack.com/p/ai-agents-can-create-convincing-ecological</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/ai-agents-can-create-convincing-ecological</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Thu, 02 Apr 2026 21:01:16 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!CUzu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>Agentic AI tools like Claude Code can write and run code, fix their own errors, and produce a formatted report with figures. I wanted to know whether that translates into reliable ecological modelling, so we ran a test: three fisheries tasks, four AI models, ten independent runs each, scored against a rubric. The results are published in <a href="https://doi.org/10.1111/faf.70079">Fish and Fisheries</a>.</p><p>We found agents can be genuinely useful, but only if you know how to use them well and only if you know enough about the analysis to catch what they miss.</p><h2>How we did our tests</h2><p>We used <a href="https://roo.cline.bot/">Roo Code</a>, an agentic AI that runs inside VS Code. Unlike a chatbot, it can write code, execute it, read error messages, and iterate autonomously. There are many popular tools for agentic AI; Claude Code is the most popular right now. We chose Roo Code because it is open source and fully customizable.</p><p>We gave it detailed specification sheets and asked it to complete three tasks. 
One was a common ecological modelling task: fitting a generalized linear model (GLM) of fish abundance against coral habitat. The other two were tasks specialized to fisheries modelling: fitting a von Bertalanffy growth curve and running a yield-per-recruit analysis. We chose these because they are common in ecological sciences, but specialized enough that LLMs probably haven&#8217;t seen many examples in their training data.</p><p>We ran each task 10 times. LLM responses have some randomness, and this multiplies when doing long-running tasks. So consistency is as important to measure as best performance. We scored every output against a rubric covering accuracy, code quality, and report quality.</p><p>We used four versions of LLMs. Two proprietary models: Claude Sonnet 4.0 and Sonnet 4.5 (which came out during review, so we added it later). One open-weight model: Kimi K2, plus its &#8216;exacto&#8217; variant.</p><p>During review, Kimi K2 &#8216;exacto&#8217; became available on the <a href="https://openrouter.ai/">OpenRouter</a> platform, so we added that. The exacto variant routes requests to providers with the best performance. Some providers run it cheaply. 
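For a concrete sense of the growth-curve task described above, a von Bertalanffy fit can be sketched in a few lines. This is an illustration in Python with SciPy, not the code the agents produced in the study, and the parameter values and data are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def von_bertalanffy(age, linf, k, t0):
    # Expected length at age: L(t) = Linf * (1 - exp(-k * (t - t0)))
    return linf * (1.0 - np.exp(-k * (age - t0)))

# Synthetic length-at-age data under invented "true" parameters
rng = np.random.default_rng(1)
ages = np.linspace(1, 15, 60)
lengths = von_bertalanffy(ages, 100.0, 0.3, -0.5) + rng.normal(0.0, 2.0, ages.size)

# Fit the curve; p0 gives rough starting values for the optimiser
params, _ = curve_fit(von_bertalanffy, ages, lengths, p0=[80.0, 0.2, 0.0])
linf_hat, k_hat, t0_hat = params
```

The method itself is standard, but there is enough fisheries-specific detail (starting values, parameter interpretation) that subtle logical errors in agent-written code are easy to miss.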
Long story short: &#8216;exacto&#8217; performed much better than simply requesting any provider&#8217;s version of K2, which highlights the importance of running open weight models on quality hardware.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CUzu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CUzu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 424w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 848w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 1272w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CUzu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png" width="1456" height="1703"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f7458232-09ff-4cea-b852-04998256b163_3551x4153.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1703,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CUzu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 424w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 848w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 1272w, https://substackcdn.com/image/fetch/$s_!CUzu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7458232-09ff-4cea-b852-04998256b163_3551x4153.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Agentic workflows vs AI assisted coding</p><h2>How to use agentic AI for ecological modelling</h2><p>We learned several key lessons about how to get the best out of agentic AI for ecological modelling.</p><p><strong>Write a detailed specification sheet.</strong> Our sheets ran to multiple pages covering analysis aims, data structure, recommended R functions and packages, expected outputs, and file naming conventions. This takes time, but writing a specification forces you to think carefully about what you actually want. <a href="https://github.com/cbrown5/agentic-ai-fisheries/blob/main/Scripts/glm-test-case/glm-readme.md">Here&#8217;s an example</a>.</p><p><strong>Specify the algorithms explicitly.</strong> Agents default to the most common method in their training data, which may not be appropriate for your question. 
If you want bootstrapped confidence intervals via the <code>boot</code> package, say so.</p><p>Even then, they may not comply: both Claude models in our study repeatedly applied natural mortality to the first age class in the yield per recruit model despite explicit instructions not to. That&#8217;s a subtle error that affected catch estimates&#8212;the numbers that would inform fishery management. These quirks of agent behaviour highlight why expert supervision is essential.</p><p><strong>Run replicates and compare outputs.</strong> Accuracy scores varied substantially between runs: sometimes the agent nailed every parameter; sometimes it got some parts correct but made systematic errors in other parts of the analysis. Running multiple agents and comparing outputs is one way to identify the best solutions.</p><p><strong>Check the things the agent doesn&#8217;t know to check.</strong> None of our agents checked for collinearity between predictors in the GLM, even though it&#8217;s standard practice. We deliberately left it out of the specification to see whether they&#8217;d check it anyway. The GLMs ran fine, the results looked coherent, but there was in fact strong collinearity between the predictors. The lesson here is that the agents are good at coding, but their conceptual implementation may be misleading, incomplete or logically flawed.</p><h2>The biggest problem with agentic AI is that it can produce professionally formatted output that contains logical errors</h2><p>This is the error type that concerns me most.</p><p>In our results we saw growth curves that plotted beautifully but used the wrong confidence interval method, and a yield analysis that applied mortality in the wrong sequence. A coding syntax error is immediately obvious.
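The collinearity check the agents skipped is cheap to run yourself. A minimal sketch in Python (the study itself used R; the data and predictor names here, &#8216;coral&#8217;, &#8216;rugosity&#8217; and &#8216;depth&#8217;, are made up for illustration) computes variance inflation factors by regressing each predictor on the others:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X.

    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j on the remaining columns plus an intercept.
    """
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1 / (1 - r2))
    return np.array(out)

# Simulated predictors: rugosity is strongly collinear with coral cover
rng = np.random.default_rng(1)
coral = rng.normal(size=200)
rugosity = coral + rng.normal(scale=0.2, size=200)
depth = rng.normal(size=200)  # independent of the other two
X = np.column_stack([coral, rugosity, depth])
print(np.round(vif(X), 1))  # coral and rugosity show VIF well above 10
```

A rule of thumb is that VIF above about 5&#8211;10 signals problematic collinearity; the point of the sketch is that the model would still fit and plot cleanly without this check.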
By contrast, a methodological shortcut embedded in otherwise clean output may be invisible unless you already know what the answer should look like.</p><p>There is a genuine risk that inexperienced researchers will use these tools to produce analyses they cannot evaluate. Experienced researchers may also get overconfident and not check results thoroughly enough. These flaws can then leak through to the applications, as we&#8217;ve seen where human errors in <a href="https://pnas.org/doi/10.1073/pnas.2426166122">ecological modelling have impacted decisions on invasive species</a>.</p><p>For scientists with strong quantitative foundations, agents offer a real efficiency gain. The specification sheets and rubrics from our study are in the supplemental materials if you want to adapt them. All our code is available on GitHub if you want to run your own tests (<a href="https://github.com/cbrown5/agentic-ai-fisheries/tree/main/Scripts">check this folder; each modelling &#8216;test-case&#8217; has the specification sheet and other files</a>).</p><p>The paper is open access: <a href="https://doi.org/10.1111/faf.70079">Brown et al.
2026, Fish and Fisheries</a>.</p>]]></content:encoded></item><item><title><![CDATA[Should researchers feel guilty about data centres that power their AI usage?]]></title><description><![CDATA[We have a window now to advocate for environmentally responsible AI development]]></description><link>https://vitaexmachina.substack.com/p/should-researchers-feel-guilty-about</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/should-researchers-feel-guilty-about</guid><dc:creator><![CDATA[Carla Sbrocchi]]></dc:creator><pubDate>Fri, 27 Mar 2026 07:34:28 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, 
https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4000" height="2250" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2250,&quot;width&quot;:4000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;an aerial view of a large industrial building&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="an aerial view of a large industrial building" title="an aerial view of a large industrial building" srcset="https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, 
https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1715026323215-a2dbb71272f6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwyfHxkYXRhJTIwY2VudHJlfGVufDB8fHx8MTc3NDU5Njc3N3ww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 
15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@geoffreymoffett">Geoffrey Moffett</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>Whenever the topic of AI tools in research comes up, someone thoughtful raises the thorny issue of the energy, water, and equity concerns associated with expanding reliance on resource-hungry data centres. And the implied conclusion is usually that researchers who use these tools are, at some level, compromising their environmental values. As people who work in marine conservation, we&#8217;re used to sitting with uncomfortable trade-offs. Scientists in our field have always had to navigate the gap between values and the practical realities of working lives. Do I take the flight to the international conference, or skip it and miss the collaboration? Do I drive to work or take public transport? Do I eat the deep-sea fish at the conference dinner because it was on the menu anyway, or make a point of declining?</p><p>Data centres have recently been in the news, whether through announcements of more investment in them or editorials about the hidden social and environmental costs associated with their operations (see <a href="https://www.theatlantic.com/magazine/2026/04/ai-data-centers-energy-demands/686064/">here</a> and <a href="https://www.theguardian.com/australia-news/2026/mar/13/ai-datacentres-environmental-impacts">here</a>). Data centres appear to be synonymous with AI advancement, so it made us wonder: are data centre concerns a version of the same personal-values trade-off, or is there something more to them that researchers using AI need to explore?
We asked ourselves a few questions to prepare for this conversation:</p><h3><em><strong>Does AI use equate to Data Centre use?</strong></em></h3><p>Data centres (large, warehouse-like buildings filled with computing equipment) are used to process, store and manage all kinds of digital data. It's easy to conflate 'data centres' with 'AI', because AI is the largest new source of demand. But data centres have been quietly powering our working lives for decades. Every email you send, every cloud-based document you collaborate on, every Zoom call, every online banking transaction, every streaming service &#8212; all of it runs through data centre infrastructure. The environmental cost of AI use is real, but it sits within a much larger picture of digital energy consumption that we've largely normalised.</p><p>Data centres are also not the only means by which researchers can access AI capabilities. There is a range of increasingly capable open-source models that can be run on consumer-grade laptops. Whilst these generally require a bit more set-up and infrastructure than web-based AIs, they allow researchers to localize some (but not all) of the costs of AI use, and have the added benefit of allowing more reproducible workflows and fully private interactions. (If you want to know how to set one up, <a href="https://posit.co/blog/setting-up-local-llms-for-r-and-python/">follow this helpful guide</a>.)</p><h3><em><strong>Does Data Centre use equate to social and environmental harm?</strong></em></h3><p>There is no doubt that using AI tools served from data centres comes with costs. And whilst we shouldn&#8217;t ignore these or neglect to seek ways to reduce them, we also shouldn&#8217;t lose sight of how these costs compare to other costs of doing research, or our regular daily activities.</p><p>The carbon footprint of data centres varies <a href="https://www.sciencedirect.com/science/article/pii/S2589004225019662">by provider and region</a>.
Based on best estimates, a typical text-based AI query uses about <a href="https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use">0.2-0.3 watt-hours (Wh)</a>. The energy consumed depends on the model used, and increases with the complexity of the query &#8211; generating an image may use around 0.5 Wh. For comparison, analyses from <a href="https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use">You (2025)</a>, <a href="https://www.nature.com/articles/d41586-025-00616-z">Chen (2025)</a> and <a href="https://www.forbes.com/sites/johnkoetsier/2025/12/03/new-data-ai-is-almost-green-compared-to-netflix-zoom-youtube/">Koetsier (2025)</a> show that microwaving something for 30 seconds uses 8 Wh, charging a smartphone uses about 22 Wh, and watching an hour of Netflix uses about 120 Wh. Boiling a kettle uses around <a href="https://www.morphyrichards.co.uk/blogs/kettle-guides/how-many-watts-does-a-kettle-use?srsltid=AfmBOopl4ppsIQUSmiI4PbWNkceZrtNRaTVHFNjIAOjlA930ZZ-Q0wrH">200 Wh</a>.
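These comparisons are easy to reproduce as back-of-envelope arithmetic. A quick sketch in Python, using only the approximate figures cited above:

```python
# Approximate energy figures cited above (Wh per activity)
WH = {
    "AI text query": 0.3,
    "AI image generation": 0.5,
    "microwave, 30 s": 8,
    "smartphone charge": 22,
    "hour of Netflix": 120,
    "boiling a kettle": 200,
}

# Express each everyday activity as a number of AI text queries
for activity, wh in WH.items():
    print(f"{activity:>20}: {wh / WH['AI text query']:.0f} queries")
```

By these figures, one kettle boil is roughly 650&#8211;700 text queries, which is the scale of comparison being made here.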
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!l6ZL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!l6ZL!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 424w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 848w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 1272w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!l6ZL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png" width="1406" height="864" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:864,&quot;width&quot;:1406,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!l6ZL!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 424w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 848w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 1272w, https://substackcdn.com/image/fetch/$s_!l6ZL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff90f3bbc-b283-4150-946c-9b8419204cc0_1406x864.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path 
d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The image summarises primary AI tasks and their energy usage (energy consumption for 1,000 repetitions of each task, plotted on a logarithmic scale for easier comparison). Figure sourced from <a href="https://www.nature.com/articles/d41586-025-00616-z">Chen 2025</a>.</figcaption></figure></div><p>The energy consumed by an hour of intensive LLM analysis (uploading a couple of hundred pages of text and running a series of queries) is roughly comparable to a <a href="https://www.forbes.com/sites/johnkoetsier/2025/12/03/new-data-ai-is-almost-green-compared-to-netflix-zoom-youtube/">standard one-hour Zoom call</a>: not negligible, but probably not the uniquely catastrophic activity it's sometimes made out to be.</p><p>Water usage is another concern. Water used in data centres seems to be mostly a direct function of how much energy is used (because water is used to cool the equipment, which heats up as it draws electricity).
<a href="https://blog.andymasley.com/p/individual-ai-use-is-not-bad-for?open=false#%C2%A7water-use">Andy Masley&#8217;s workings</a> show that the American energy sector as a whole uses 60 trillion litres of water each year, enough for 25 million Olympic swimming pools. In contrast, ChatGPT uses about 1,500 Olympic swimming pools of water a year (counting the water used to generate its energy), or 0.006% of America&#8217;s total water used for energy.</p><p>Having said all that, tech companies haven&#8217;t been forthcoming with information on power usage (<a href="https://www.nature.com/articles/d41586-025-00616-z">Chen 2025</a>, <a href="https://www.sciencedirect.com/science/article/pii/S2589004225019662">Hankendi 2025</a>), which makes getting accurate numbers difficult. It&#8217;s likely that power usage per token will go down in future as the tech gets better; there is strong economic pressure to generate the same content more cheaply. But overall usage could go up dramatically. Earlier estimates put ChatGPT energy and water use per person as inconsequential. However, as we scale up the use of agents and integrate them into everyday work, energy use goes up dramatically. In the early days, it was just people asking ChatGPT simple questions. Now, we&#8217;re using GenAI and AI-automation in more complex ways, and across more workflows (eg. asking Claude Desktop to make PowerPoint slides), <a href="https://www.simonpcouch.com/blog/2026-01-20-cc-impact/">using a thousand times more tokens</a>. Multiply this by thousands of new users each day and what was negligible impact becomes significant.
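To see how agentic workloads change the arithmetic, here is a rough scaling sketch. All numbers are illustrative: roughly 0.3 Wh per chat query as above, an assumed 1,000 tokens per typical query, and 700 million tokens per week for a continuously running agent (the figure quoted next):

```python
WH_PER_QUERY = 0.3          # rough figure for a text query, as cited above
TOKENS_PER_QUERY = 1_000    # assumed size of a typical chat exchange
WH_PER_TOKEN = WH_PER_QUERY / TOKENS_PER_QUERY

AGENT_TOKENS_PER_WEEK = 700_000_000  # one continuously running agent

kwh_per_week = AGENT_TOKENS_PER_WEEK * WH_PER_TOKEN / 1000
print(f"~{kwh_per_week:.0f} kWh per agent per week")  # ~210 kWh
```

Under these assumptions a single full-time agent uses on the order of 200 kWh a week, which is why per-query estimates stop being a useful guide.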
<a href="https://www.nytimes.com/2026/03/20/technology/tokenmaxxing-ai-agents.html">Ege Erdil</a>, the co-founder of Mechanize, says &#8220;If you have continuously running agents, you&#8217;ll do 700 million tokens a week from a single full-time agent.&#8221;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!siwZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!siwZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 424w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 848w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 1272w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!siwZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png" width="1152" height="711" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:711,&quot;width&quot;:1152,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!siwZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 424w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 848w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 1272w, https://substackcdn.com/image/fetch/$s_!siwZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4f9d7f29-8d0c-49b2-9e4d-3c6a4ff4b423_1152x711.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Energy usage of various daily activities including LLM interactions. The user&#8217;s data reflects their description as an &#8220;extreme power user&#8221;.  Figure sourced from <a href="https://www.simonpcouch.com/blog/2026-01-20-cc-impact/">Simon P Couch, 2026</a>.</figcaption></figure></div><p></p><h3><em><strong>An Australian Case Study</strong></em></h3><p>Australia currently has <a href="https://www.datacentermap.com/australia/">279 data centres</a> and is becoming <a href="https://www.climatecouncil.org.au/what-does-the-data-centre-boom-mean-for-australias-switch-to-renewables/">one of the world&#8217;s top-five data centre markets</a>&#8212; an extraordinary position for a country of its size. Australia is attractive because of its stable regulatory environment, physical location as a gateway to Asia and the Pacific, and the potential for large-scale renewable energy generation. 
The sector is growing fast.</p><p>Market analysis shows that data centre electricity consumption could make up 11% of Australia&#8217;s demand by 2035, a huge leap from 1% today, potentially affecting electricity availability and prices for individual consumers (<a href="https://www.cefc.com.au/media/media-release/data-centre-boom-to-reshape-australia-s-energy-future-cefc-baringa-report/">CEFC 2025</a>, <a href="https://www.pv-tech.org/social-backlash-inevitable-industry-demands-data-centres-stop-freeloading-on-australias-clean-energy/">Heynes 2026</a>).</p><p>Australian policy appears to be keeping up with the regulatory demands of data centre growth, a challenge that has plagued other parts of the world, such as the USA. From July 2025, all data centres hosting federal government workloads are required to achieve a minimum five-star NABERS rating under the <a href="https://www.finance.gov.au/sites/default/files/2023-11/Net_Zero_Government_Operations_Strategy.pdf">Net Zero in Government Operations Strategy</a>. Late last year, the Australian Government launched the <a href="https://www.industry.gov.au/publications/national-ai-plan/national-ai-plan-page">National AI Plan</a> and, just this week, released a policy that articulates <a href="https://www.industry.gov.au/publications/expectations-data-centres-and-ai-infrastructure-developers">expectations of data centres for their license to operate in Australia</a>.
These expectations require data centre projects to demonstrate &#8220;benefit to the Australian economy, people and their local communities&#8221; to receive priority for approval or investment.</p><p>This policy expects data centre projects to support Australia&#8217;s energy transition (eg data centres should not place upward pressure on energy prices, should minimise energy demand, and use clean energy where available); minimise water usage (eg mitigate impacts of water disruptions, drought and climate change, use secure, non-potable water); create fair, safe, secure and well-paid jobs for Australian workers; and support research and innovation across all sectors at favourable terms. Further, in NSW, an ongoing <a href="https://www.parliament.nsw.gov.au/lcdocs/inquiries/3169/Terms%20of%20reference%20-%20PAWC%20-%20Data%20centres%20-%20updated%205%20February%202026.pdf">parliamentary enquiry into data centres</a> will help governments and communities to set operating rules for how data centre projects should be considered in light of environmental factors, planning frameworks, electricity demand, community impacts, housing availability, workforce considerations, and economic and distributional outcomes.</p><p>Although criticisms have been levelled at data centres from a range of climate-concerned organisations, these same organisations agreed that &#8220;<strong>handled wisely, [Australia&#8217;s] new demand could become a powerful driver for renewable energy investment. Data centres could anchor new solar, wind and battery projects, financing firming capacity and supporting regional economic development&#8221;</strong> (<a href="https://www.climatecouncil.org.au/what-does-the-data-centre-boom-mean-for-australias-switch-to-renewables/">Climate Council 2025</a>). 
Data centres have the potential to accelerate and anchor critical, renewable energy transitions.</p><div class="pullquote"><p><strong>Handled wisely, this new demand could become a powerful driver for renewable energy investment. Data centres could anchor new solar, wind and battery projects, financing firming capacity and supporting regional economic development.</strong></p></div><h3><em><strong>Is the moral response prohibition, or governance and harm-reduction?</strong></em></h3><p>We need to have a serious conversation about the environmental cost of the digital infrastructure underpinning all research.  We started out with a provocation on whether researchers should feel guilty: the more productive version of the question isn&#8217;t, &#8220;<em>Should I feel guilty for using Claude to summarise this literature?&#8221;</em> but rather: <em>&#8220;What choices do I have, as a researcher and as a citizen, to take advantage of cutting-edge tools without needlessly contributing to harmful trends?&#8221;</em></p><p>Some of these are professional choices: advocating within institutions for procurement policies that favour data centre providers with genuine renewable commitments; being transparent in methods sections about which tools you use (so the field can track cumulative usage and its implications); asking questions of our own institutions about their digital data footprint.</p><p>Some are civic choices such as engaging with national data centre strategy consultations if you have relevant expertise, and advocating for greater transparency and accountability on the environmental impact of AI agents.</p><p>And some are just practical: using smaller, more efficient models when a task doesn't require the largest one; just doing a google search rather than an AI-based internet query; being specific about prompts so your queries are targeted and efficient. 
Not because these individual choices transform the system, but because they reflect a disposition toward thoughtful use rather than uncritical adoption &#8212; which, for a researcher, is the right posture anyway.</p><p>The conversation about data centres and AI usage is broader than we initially imagined - it&#8217;s more than personal values; it includes the realities of infrastructure, governance, and investment. In Australia and around the world, right now, there&#8217;s a window to shape how this plays out. That seems like a good place to put our energy.</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://vitaexmachina.substack.com/p/should-researchers-feel-guilty-about?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://vitaexmachina.substack.com/p/should-researchers-feel-guilty-about?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://vitaexmachina.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://vitaexmachina.substack.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Thanks for subscribing]]></title><description><![CDATA[A rundown of recent posts]]></description><link>https://vitaexmachina.substack.com/p/thanks-for-subscribing</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/thanks-for-subscribing</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Wed, 25 Mar 2026 23:16:29 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hello to new subscribers and thanks for subscribing. As promised, we&#8217;ll try to limit emails to occasional newsletters. Here&#8217;s a run-down of recent posts, as well as old favourites that recent subscribers might have missed.</p><p>In coming weeks we have some posts planned about the growing environmental cost of generative AI and a test of how thoroughly agentic AI can autonomously complete ecological and fisheries modelling tasks.</p><h2>Recent Posts</h2><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;57381042-db77-4968-86b2-2705a12de0bc&quot;,&quot;caption&quot;:&quot;Carla: Many of our colleagues are struggling with two big challenges: the ethics of using AI (mostly GenAI) tools in their research and teaching, and developing their own literacy and capabilities in using AI tools. We hope that the blog can shed some light on how to build up key capabilities through various use cases (how to use GenAi in coding, how to&#8230;&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Why vita ex machina? 
&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:423976327,&quot;name&quot;:&quot;Carla Sbrocchi&quot;,&quot;bio&quot;:&quot;Marine social scientist by day&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/efcea2b2-7ed9-472a-aef3-fa5c9394d975_144x144.png&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-03-24T07:06:30.919Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!hsqe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/why-vita-ex-machina&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:191953066,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;1d324c79-566f-477b-a810-f632478da3f5&quot;,&quot;caption&quot;:&quot;Like it or not, everyone is using large language models to help do their statistics.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Prompting large language models for quality ecological statistics&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:421587541,&quot;name&quot;:&quot;Chris 
Brown&quot;,&quot;bio&quot;:&quot;Associate Prof of Fisheries Science, helping data tell stories about ocean ecosystem management. Fisheries, climate change, marine biodiversity, statistics, modelling&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a6c5918-167a-4d31-bf0d-b0f091f13713_874x874.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-03-16T22:12:32.837Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!CLQA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/prompting-large-language-models-for&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:191189775,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;5d26d736-5946-40e1-89cb-e83e7e5bedda&quot;,&quot;caption&quot;:&quot;The writing is on the wall, the genie is out of the bottle, and Pandora&#8217;s box has cracked wide open&#8212;however you want to say it, generative AI is here, and it&#8217;s rapidly reshaping how we build ecosystem models.&quot;,&quot;cta&quot;:&quot;Read full 
story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;LLM-Enabled Mechanistic Modelling&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:444213218,&quot;name&quot;:&quot;Scott Spillias&quot;,&quot;bio&quot;:&quot;Marine and environmental research scientist in Tasmania. Interested in oceans, models, and how AI is reshaping science, policy, and decision&#8209;making.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3e0196a4-b825-4b0e-b4e1-536d4ce286bc_144x144.png&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-03-13T03:10:34.325Z&quot;,&quot;cover_image&quot;:null,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/llm-enabled-mechanistic-modelling&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:189725281,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;8babe0e4-e0dd-4378-ac3d-b4af049e0837&quot;,&quot;caption&quot;:&quot;I recently came across this writeup by Celina Zhao about the Hao et al. 
(2026) paper about AI tools (in a very broad sense) making researchers in the natural sciences more productive (more publications, more citations, faster career advancement somehow), although at the expense of the breadth of topics and domains people are working on.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;AI-enhanced scientists?&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:15653794,&quot;name&quot;:&quot;Luis D. Verde Arregoitia&quot;,&quot;bio&quot;:&quot;Mammals, R, macroecology&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/43aa6f7b-9456-406c-9ee0-316672df1cf8_1599x1599.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-03-02T17:20:54.931Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!_g4R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/ai-enhanced-scientists&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:189673853,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:1,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><h1>Old favourites</h1><div class="digest-post-embed" 
data-attrs="{&quot;nodeId&quot;:&quot;1b0700d4-9bbb-4536-8ebb-548643d2300c&quot;,&quot;caption&quot;:&quot;For the past year or so I&#8217;ve tried my best to keep this online guide to tools, packages, and resources for working with LLMs in R up to date. This month I added 8 more packages and an IDE extension.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;LLMs + R resource guide&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:15653794,&quot;name&quot;:&quot;Luis D. Verde Arregoitia&quot;,&quot;bio&quot;:&quot;Mammals, R, macroecology&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/43aa6f7b-9456-406c-9ee0-316672df1cf8_1599x1599.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-02-17T02:45:59.536Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!pJ_U!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/llms-r-resource-guide&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:188213250,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:false,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="digest-post-embed" 
data-attrs="{&quot;nodeId&quot;:&quot;8b79c42c-6a53-4026-817b-05e3546b002d&quot;,&quot;caption&quot;:&quot;Image generation models are powerful tools for science communication. But generative AI also (rightfully) has a reputation for hallucinations and making stuff up.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Don't use AI generated photos in scientific presentations&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:421587541,&quot;name&quot;:&quot;Chris Brown&quot;,&quot;bio&quot;:&quot;Associate Prof of Fisheries Science, helping data tell stories about ocean ecosystem management. Fisheries, climate change, marine biodiversity, statistics, modelling&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a6c5918-167a-4d31-bf0d-b0f091f13713_874x874.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-02-06T20:30:22.398Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!MznX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/dont-use-ai-generated-photos-in-scientific&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:186654829,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:2,&quot;comment_count&quot;:1,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex 
machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;2f379f74-0ab9-49a4-84a9-7a8fd269427e&quot;,&quot;caption&quot;:&quot;I&#8217;ve been playing around with Google Gemini Pro 3 (AKA Nano Banana) as a way to make science infographics. Here are the key lessons from my attempts.&quot;,&quot;cta&quot;:&quot;Read full story&quot;,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;Tips on making science infographics with AI and nano banana&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:421587541,&quot;name&quot;:&quot;Chris Brown&quot;,&quot;bio&quot;:&quot;Associate Prof of Fisheries Science, helping data tell stories about ocean ecosystem management. 
Fisheries, climate change, marine biodiversity, statistics, modelling&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a6c5918-167a-4d31-bf0d-b0f091f13713_874x874.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2026-02-02T19:56:48.919Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/$s_!zgvB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://vitaexmachina.substack.com/p/tips-on-making-science-infographics&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:186654972,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:1,&quot;comment_count&quot;:0,&quot;publication_id&quot;:7821195,&quot;publication_name&quot;:&quot;Vita ex machina&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!Qt-g!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://vitaexmachina.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Vita ex machina! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Why vita ex machina? ]]></title><description><![CDATA[The origin story of the site, from its collaborators]]></description><link>https://vitaexmachina.substack.com/p/why-vita-ex-machina</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/why-vita-ex-machina</guid><dc:creator><![CDATA[Carla Sbrocchi]]></dc:creator><pubDate>Tue, 24 Mar 2026 07:06:30 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!hsqe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hsqe!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hsqe!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!hsqe!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!hsqe!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!hsqe!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hsqe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:912880,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://vitaexmachina.substack.com/i/191953066?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hsqe!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!hsqe!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!hsqe!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!hsqe!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ecf99e4-33a9-440d-b511-28f8878bf8a6_1024x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Carla: Many of our colleagues are struggling with two big challenges: the ethics of using AI (mostly GenAI) tools in their research and teaching, and developing their own literacy and capabilities in using AI tools. We hope that the blog can shed some light on how to build up key capabilities through various use cases (how to use GenAI in coding, how to use GenAI in literature reviews, how to use GenAI in image creation, etc.) and, at the same time, discuss the challenges associated with learning to use and apply these tools in our work lives.</p><p>Chris: Technology isn&#8217;t going to solve our environmental problems. But some people will have you believe that.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://vitaexmachina.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Vita ex machina! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>The name <em>vita ex machina</em> means &#8216;life from the machine&#8217;. It&#8217;s a play on &#8216;<em>deus ex machina</em>&#8217;, which means &#8216;god from the machine&#8217;.<em> Deus ex machina </em>is a crude plot device that an author can use to resolve complex plot lines. 
In classical theatre, stagehands would just lower a god down to the stage on a platform (the &#8216;machine&#8217;) and the god would make everything OK.</p><p>I liked this name because lots of pro-tech people are pitching a <em>deus ex machina</em> solution: &#8220;technology will get better and solve humanity&#8217;s problems&#8221;. But that&#8217;s overly simplistic. Technological progress is a big part of the environmental mess we currently find ourselves in. AI isn&#8217;t just going to solve that problem.</p><p>The widespread use of generative AI in science is causing a massive upheaval in research methods, but also in our norms about what&#8217;s appropriate. So I wanted to have a blog that serves a dual purpose: (1) to show how we can use generative AI to help environmental conservation, and (2) to critically appraise what&#8217;s happening with generative AI in research.</p><p>We address (1) with the handy tools and hints posts (like this one: <a href="https://vitaexmachina.substack.com/p/llms-r-resource-guide">https://vitaexmachina.substack.com/p/llms-r-resource-guide</a>). We address (2) when we look at the meatier issues, ethical questions and philosophies (e.g. <a href="https://vitaexmachina.substack.com/p/dont-use-ai-generated-photos-in-scientific">https://vitaexmachina.substack.com/p/dont-use-ai-generated-photos-in-scientific</a> and <a href="https://vitaexmachina.substack.com/p/ai-tools-and-the-social-part-of-socio">https://vitaexmachina.substack.com/p/ai-tools-and-the-social-part-of-socio</a>).</p><p>Luis: I was happy to be invited to participate in this blog after several months of maintaining a guide on tools for working with LLMs in R. For the most part, I tried to remain impartial when reviewing or even just listing packages, extensions, or other relevant software, but <em>vita ex machina</em> is a great outlet for actually voicing my opinions on AI in coding and how this affects research in ecology and evolution. 
The field is shifting quite fast, and at this point AI can be a motivating productivity boost or a pathway to awkward slop that ends up getting in the way of our original goals.</p><p>Scott: The coming wave of AI is set to significantly change how we interact with one another, with knowledge, and with the non-human life we share the planet with. Like any technology, it will not solve all our problems (and it is already creating new ones), but with thoughtful preparation we have a much better chance of using it in ways that do more good than harm.</p><p>When it comes to wicked problems, especially those that emerge at the intersection of society and ecology, the complexity involved makes it impossible for any one person, or even a group of people, to fully synthesise all the relevant information. In these contexts, careful and deliberate use of AI can support more effective decision-making and, ideally, help us collaborate more successfully as we learn to live with and care for nature.</p><p>Realising this potential begins with building familiarity and fluency with these tools, essentially becoming AI literate. I hope some of the material on this blog supports those taking their first steps on that journey.</p>]]></content:encoded></item><item><title><![CDATA[Prompting large language models for quality ecological statistics]]></title><description><![CDATA[New study looks at how to get scientifically valid stats out of LLMs and LLM agents]]></description><link>https://vitaexmachina.substack.com/p/prompting-large-language-models-for</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/prompting-large-language-models-for</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Mon, 16 Mar 2026 22:12:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!CLQA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Like it or not, everyone is using large language models to help do their statistics.</p><p>There are a lot of claims about whether LLMs can or can&#8217;t do useful ecological statistics. In our new paper <a href="https://besjournals.onlinelibrary.wiley.com/doi/10.1111/2041-210x.70267?af=R">&#8220;Prompting large language models for quality ecological statistics&#8221;</a> (in Methods in Ecology and Evolution) we wanted to test those claims quantitatively.</p><p>We provide guidelines for how to use LLMs to produce scientifically valid statistical analyses. The short version: LLMs can help you do better statistics, but only if you ask well. 
And &#8220;asking well&#8221; is a learnable skill.</p><p>The paper is co-authored with agentic AI expert Scott Spillias and grew out of our experiences teaching researchers to use LLMs for statistical analysis.</p><h2>Why we wrote it</h2><p>I&#8217;ve been teaching researchers how to use LLMs for coding and statistics for a few years now. Two things kept coming up. First, most people underestimate how much the prompt matters &#8212; they treat LLMs like a search engine and get search-engine-quality advice. Second, some people trust LLM-generated code and statistics uncritically, which is risky.</p><p>Studies from late 2024 found LLMs recommended the correct statistical test less than 40% of the time with generic prompts. But accuracy improves substantially with more specific prompts, and writing specific prompts is a key skill we focus on in the article.</p><h2>What we found</h2><p>We ran repeatable evaluations in R, replicating each prompt 10 times across multiple LLMs. A few results stood out.</p><p><strong>Specificity matters enormously for test selection.</strong> We compared four prompts for choosing a statistical test for an ecological dataset, ranging from &#8220;How do I test the relationship between two continuous variables?&#8221; to a detailed prompt specifying variable types, sample size, and study design. The generic prompt never suggested count models (the appropriate family for fish abundance data). The detailed prompt elicited them every time, regardless of which LLM we used.</p><p><strong>Detailed prompts make agent-generated code more consistent.</strong> We asked the GitHub Copilot agent to write an entire analysis workflow from two different prompts &#8212; one brief, one detailed. We then ran multivariate ordination on the resulting code to measure how similar the 10 replicates were to each other. The detailed prompt produced much tighter clusters &#8212; the agent kept using the same functions, variable names, and structure. 
Inconsistent code is harder to review, which matters if you&#8217;re trying to catch statistical errors.</p><p><strong>Being specific overcomes weaker models.</strong> When we tasked LLMs with writing R code to calculate a distance matrix, the detailed prompt got the right answer 90&#8211;100% of the time across all models. The brief prompt mostly failed &#8212; except for GPT-5 Codex, which guessed correctly 9/10 times. Good prompts effectively compensated for using a smaller, cheaper model.</p><h2>The workflow we recommend</h2><p>We suggest breaking LLM-assisted analysis into three stages, and writing separate prompts for each. This helps you control the workflow and avoid LLM mistakes.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CLQA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CLQA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CLQA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CLQA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!CLQA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CLQA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg" width="1456" height="918" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:918,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CLQA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CLQA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CLQA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!CLQA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F84201c14-864c-48d1-97d6-ec8f64554de3_1830x1154.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Recommended workflow for using LLMs to complement traditional approaches to statistical analysis</p><ol><li><p><strong>Choose the statistical approach</strong> &#8212; describe your variables, sample size, and study design in detail. Attach the data or a summary. 
Point the LLM to reference material you trust.</p></li><li><p><strong>Plan the implementation</strong> &#8212; before writing any code, get the LLM to help you structure the project directory and scripts. We use a <code>readme.md</code> that includes research context, analysis steps, package preferences, and directory layout. This file then gets attached to every subsequent prompt, giving the LLM a consistent memory across sessions.</p></li><li><p><strong>Write the code</strong> &#8212; with a detailed readme and explicit instructions, agents can complete analyses with minimal supervision. Without that structure, even strong models produce code that&#8217;s hard to review and inconsistent across runs.</p></li></ol><h2>General prompting tips</h2><p>These apply regardless of which stage of analysis you&#8217;re at:</p><ul><li><p><strong>Declare a role upfront</strong> &#8212; start with &#8220;You are an expert in ecological statistics and R.&#8221; This orients the LLM and nudges it toward discipline-appropriate methods.</p></li><li><p><strong>Avoid multi-turn conversation where possible</strong> &#8212; each turn adds context that can&#8217;t be removed. If the conversation goes wrong, start fresh with a better prompt rather than trying to correct course.</p></li><li><p><strong>Use prompt bootstrapping</strong> &#8212; ask the LLM what information it would need to answer your question better, then start a new session with that improved prompt.</p></li><li><p><strong>Attach your own references</strong> &#8212; rather than letting the LLM search the web, point it to tutorials and vignettes you&#8217;ve already vetted. You control the quality of the context.</p></li><li><p><strong>Break problems into steps</strong> &#8212; don&#8217;t ask for everything at once. Separate choosing a method, planning the code structure, and writing the code into distinct prompts.</p></li></ul><h2>The important role for the scientist</h2><p>Statistical expertise is still required. 
You need to know enough to evaluate whether the LLM&#8217;s suggestions are appropriate, check that code is scientifically valid (not just syntactically correct), and understand what the results mean. Novices who lack that background are more likely to write poor prompts and less likely to catch bad advice.</p><p>We think LLM literacy should be part of statistical training programs, alongside the fundamentals. In our own training we start novices off on R fundamentals and stats first, with no AI use other than fixing bugs or asking for explanations of unfamiliar code. Once they grasp the core concepts we move on to more advanced forms of AI integration. This needs to be introduced stepwise.</p><p>We&#8217;re also only beginning to understand LLM biases for ecological data. Spatial dependencies, nested designs, and zero-inflated count data are common in ecology but probably underrepresented in LLM training data. There&#8217;s a lot of evaluation work still to do.</p><p>The paper and all the code for the evaluations are at <a href="https://doi.org/10.5281/zenodo.18463012">Zenodo</a>. 
If you want the full workflow in practice, there&#8217;s also a <a href="https://www.seascapemodels.org/R-llm-workshop/">one-day course</a> and an <a href="https://www.seascapemodels.org/AI-assistants-for-scientific-coding/">online book</a>.</p>]]></content:encoded></item><item><title><![CDATA[LLM-Enabled Mechanistic Modelling]]></title><description><![CDATA[GenAI learns to create ecosystem models]]></description><link>https://vitaexmachina.substack.com/p/llm-enabled-mechanistic-modelling</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/llm-enabled-mechanistic-modelling</guid><dc:creator><![CDATA[Scott Spillias]]></dc:creator><pubDate>Fri, 13 Mar 2026 03:10:34 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The writing is on the wall, the genie is out of the bottle, and Pandora&#8217;s box has cracked wide open&#8212;however you want to say it, generative AI is here, and it&#8217;s rapidly reshaping how we build ecosystem models.</p><p>We&#8217;ve been exploring this through a project called LEMMA, which stands for LLM&#8209;Enabled Mechanistic Modelling for ecosystem Assessment. It grew out of a simple question: ecologists depend on mechanistic models to understand how ecosystems work and to test management options, but these models take a huge amount of time and careful coding. With climate and biodiversity pressures accelerating, that slow pace is becoming a real bottleneck. At the same time, large language models have become surprisingly good at reading, writing, and coding. 
This raised the question of whether AI could accelerate the most time&#8209;consuming parts of building these models while keeping everything transparent and grounded in real ecology.</p><p>LEMMA is designed as a co&#8209;pilot for model development. It can propose model equations and generate compilable C++ code for dynamic ecosystem models, making the whole process visible and open to critique. It improves models through iterative generations, keeping strong candidates and discarding weaker ones. It also suggests parameter values by searching curated literature for plausible numbers and ranges. The goal is to produce a set of interpretable, transparent model candidates with clearly documented assumptions.</p><p>We tested LEMMA on two different systems. First, we asked it to recover the structure behind simulated nutrient&#8211;phytoplankton&#8211;zooplankton time series. 
It produced high&#8209;quality fits and reconstructed the central ecological processes described in decades of NPZ research, occasionally offering alternative but still ecologically sensible formulations.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;c40f4684-6014-4f40-b0a3-94c3d4705f3d&quot;,&quot;duration&quot;:null}"></div><p><em><strong>NPZ reconstruction.</strong> The top panel shows the training curve over generations. The lower panels show simulated nutrient, phytoplankton and zooplankton time series (black points) with LEMMA&#8217;s evolving model predictions (coloured curves). The video shows how candidate models improve and eventually capture the key nutrient&#8211;phytoplankton&#8211;zooplankton dynamics.</em></p><p>Next, we applied it to a real management problem involving crown&#8209;of&#8209;thorns starfish outbreaks and coral cover on the Great Barrier Reef. LEMMA generated models that captured key ecological patterns and successfully predicted years withheld from training, including outbreak timing. Important features like density dependence, coral resource limitation, and recruitment pulses appeared consistently.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;d5208376-79f1-4eca-9b89-c02e343b9074&quot;,&quot;duration&quot;:null}"></div><p><em><strong>COTS&#8211;coral modelling.</strong> The top panel shows the training curve. The second panel plots crown&#8209;of&#8209;thorns starfish abundance, and the two bottom panels show two coral groups. Black points are observed data; coloured curves are predictions from successive model generations. 
The video illustrates how LEMMA converges on structures that reproduce outbreak timing and coral&#8211;starfish interactions.</em></p><p>As tools like LEMMA develop, they may significantly speed up model prototyping, allowing researchers to focus more on interpretation, testing, and collaboration. Because each model comes with equations and parameter files, the results remain easy to review, compare, and modify. The system can also surface new hypotheses when it proposes unfamiliar but plausible ecological terms. Faster iteration can also help modelling keep pace with real&#8209;world management decisions.</p><p>This is exciting stuff, but we humans shouldn&#8217;t be kicking our feet up just yet. Different runs often produce different but reasonable model structures. Only a small portion of parameters in this version were linked directly to citations, which highlights the need for improved literature integration. And although we worked with relatively compact systems, larger spatial or multi&#8209;system models will require further development. </p><p>The paper outlines practices that support safe use, including expert review, good documentation, stakeholder involvement, and proper validation before applying results to management decisions.</p><p>Looking ahead, we are working on richer literature searches, better parameter provenance, the use of open&#8209;source language models for reproducibility and privacy, new optimisation goals, and ways to integrate LEMMA with ensemble modelling workflows. 
If you&#8217;re interested in trying LEMMA on your own ecological time series&#8212;whether reefs, fisheries, or coastal systems&#8212;we&#8217;d be keen to talk.</p><p>The full paper, &#8220;Data&#8209;driven discovery of mechanistic ecosystem models with LLMs,&#8221; appears in Methods in Ecology and Evolution (DOI: 10.1111/2041&#8209;210x.70244), and the associated data and code are available through the Zenodo link in the paper&#8217;s Data Availability statement.</p>]]></content:encoded></item><item><title><![CDATA[AI-enhanced scientists?]]></title><description><![CDATA[AI tools as potential boosters of academic production]]></description><link>https://vitaexmachina.substack.com/p/ai-enhanced-scientists</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/ai-enhanced-scientists</guid><dc:creator><![CDATA[Luis D. 
Verde Arregoitia]]></dc:creator><pubDate>Mon, 02 Mar 2026 17:20:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!_g4R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I recently came across this <a href="https://www.science.org/content/article/ai-has-supercharged-scientists-may-have-shrunk-science">writeup</a> by Celina Zhao about the Hao et al. (2026) <a href="https://www.nature.com/articles/s41586-025-09922-y">paper</a> about AI tools (in a very broad sense) making researchers in the natural sciences more productive (more publications, more citations, faster career advancement somehow), although at the expense of the breadth of topics and domains people are working on.</p><p>It was interesting to see a quantitative measure of impact, which seems a bit high to me (e.g. publishing 3.02 times more papers thanks to these tools), but it somewhat supports the common saying (as restated by Zhao) that &#8220;<em>AI won&#8217;t replace you, but someone using AI might&#8221;</em>. I don&#8217;t know about AI replacing people, or AI users replacing non-users, but reading these pieces strengthened my opinion that AI tools give the greatest advantages to the most advanced coders/writers/programmers.</p><p>Just last week I was watching Hadley Wickham&#8217;s keynote on Claude Code for R as part of the <a href="https://conference.rainbowr.org/">rainbowR</a> conference, in which Hadley said that &#8216;the genie is out of the bottle&#8217; and that thanks to these tools &#8216;software engineering has changed irrevocably&#8217;. 
Just watching demos by someone that advanced in R and in programming with LLM APIs, I became even more convinced that:</p><p>a) Actually learning things is not a waste of time.</p><p>b) The more we know (about programming or scientific writing), the more we can customize and leverage AI tools to help us become more efficient.</p><p>c) Advanced users with TUI tools like Claude Code and expensive subscriptions with lots of tokens to burn have a major advantage over us mortals.</p><p>I don&#8217;t consider myself an AI-enhanced anything, but last week I used the <a href="http://github.com/cornball-ai/llamaR">llamaR CLI</a> coding agent (written in R!) by Troy Hernandez and had the assistant (with Claude Sonnet 4) help me organize and rename the PDF files in a folder according to their contents and dates. I had to do this for multiple files and multiple folders for an important admin process at work, and it had to be done before a deadline. Using a quick prompt in natural language saved me at least 90 minutes compared with doing the work manually, and having the assistant read the files and run some shell commands allowed me to submit my files on time. I still checked the outputs manually and everything was OK. Other colleagues also had to submit their evaluation dockets, and I feel bad for those who had to sort and name their files manually. 
This anecdote is not about academic productivity and it&#8217;s a very specific example, but automating tasks and writing boilerplate with LLMs is certainly a great way to save time and worth the learning curve.</p><p>Just a thought.</p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!_g4R!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_g4R!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 424w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 848w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 1272w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!_g4R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp" width="1456" height="961" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:961,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;The One Reason 'Terminator 2' Still Holds Up&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="The One Reason 'Terminator 2' Still Holds Up" title="The One Reason 'Terminator 2' Still Holds Up" srcset="https://substackcdn.com/image/fetch/$s_!_g4R!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 424w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 848w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 1272w, https://substackcdn.com/image/fetch/$s_!_g4R!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffc485d6b-bbbb-432f-b35f-40fe48ef94c1_1600x1056.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" 
stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div>]]></content:encoded></item><item><title><![CDATA[Do we really need to teach students to code?]]></title><description><![CDATA[Since chatGPT, stats academics have been saying that we need to teach coding, but I no longer think we do]]></description><link>https://vitaexmachina.substack.com/p/do-we-really-need-to-teach-students</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/do-we-really-need-to-teach-students</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Mon, 02 Mar 2026 12:48:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2YtK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I&#8217;m just about to publish a paper on using large language models 
to write code for ecological modelling. In the paper we argue for retaining the teaching of coding in undergraduate programs. Teaching coding is teaching undergraduates to reason quantitatively. </p><p>But I wonder what opportunities we are missing with this strong view that coding is central to statistical training? </p><p>Coding is certainly not the only way to teach reasoning. People were reasoning effectively well before computer coding was invented. </p><p>As an example of alternative possibilities, I vibe coded a webpage that takes mermaid syntax (a flow chart syntax) and turns it into a Bayesian model (specifically written in the Stan language). <a href="https://www.seascapemodels.org/mermaid-to-stan/">Try a prototype of &#8216;mermaid to stan&#8217; here</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2YtK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!2YtK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 424w, https://substackcdn.com/image/fetch/$s_!2YtK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 848w, https://substackcdn.com/image/fetch/$s_!2YtK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 1272w, 
https://substackcdn.com/image/fetch/$s_!2YtK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2YtK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png" width="1456" height="1125" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1125,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:297546,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://vitaexmachina.substack.com/i/189643517?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!2YtK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 424w, https://substackcdn.com/image/fetch/$s_!2YtK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 848w, 
https://substackcdn.com/image/fetch/$s_!2YtK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 1272w, https://substackcdn.com/image/fetch/$s_!2YtK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5a0aed62-8d14-4ae7-9e1c-cbe237743aac_2044x1580.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">The Mermaid to Stan application prototype, vibe coded in about 15 
minutes&#8230;</figcaption></figure></div><p>Directed Acyclic Graphs (i.e. flow charts) are a great way to reason when you are building Bayesian models. Arguably it is much easier to spot problems in a DAG than in code.</p><p>So you make your DAG, and you get code that you can copy and paste into R. In principle, if you understand the DAG you don&#8217;t need to read the R code (though I should say this is a prototype, so you should read the R code in this case). </p><p>Many students struggle to learn coding and may never need to use it beyond their degree. Of course, some will use it extensively. </p><p>We may be better off streaming students so that:</p><ol><li><p>Everyone gets trained in logical reasoning and statistical theory (essential in the sciences and, I think, in general life)</p></li><li><p>Students who want to can learn to code</p></li></ol><h2>What about code QA/QC?</h2><p>Detractors will argue you &#8216;need to understand the code to be able to check it&#8217;. </p><p>That&#8217;s true of genAI-written code (though maybe not for much longer). </p><p>But we can create tools like my prototype that have all the necessary quality checks built in. That&#8217;s not a new innovation; it&#8217;s just that generative AI makes building such tools super easy. </p><p>I can now build an app for my students in about 15 minutes. With that capability I could potentially bypass the code altogether. </p><p>What&#8217;s important with AI-generated applications is extensive testing, so building tests into the development process is key.</p><p>Likewise, students using these tools need tests to self-check what they are doing. These could include graphs or simulated data so students can check the model works as intended. </p><h2>Interactive tools for students</h2><p>Coding can be such a huge barrier to teaching new students statistics and quantitative methods. They get stuck on details and frustrated by tiny bugs they haven&#8217;t learned to spot.
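</p><p>Returning to the mermaid-to-Stan idea, here is a toy sketch (in Python, and emphatically not the real app&#8217;s code) of how mermaid edge syntax could be mapped onto a Stan-style model block. The linear-Gaussian template and all names here are illustrative assumptions:</p>

```python
import re

def mermaid_to_stan(mermaid: str) -> str:
    """Hypothetical sketch: turn 'parent --> child' mermaid edges into a
    Stan-style model block, assuming every node is linear-Gaussian.
    The real mermaid-to-stan prototype is more sophisticated than this."""
    # Each edge 'A --> B' means A is a parent (predictor) of B
    edges = re.findall(r"(\w+)\s*-->\s*(\w+)", mermaid)
    parents = {}
    for parent, child in edges:
        parents.setdefault(child, []).append(parent)
    # Build one linear predictor per child node
    lines = ["model {"]
    for child, ps in parents.items():
        terms = " + ".join(f"b_{p} * {p}" for p in ps)
        lines.append(f"  {child} ~ normal(a_{child} + {terms}, sigma_{child});")
    lines.append("}")
    return "\n".join(lines)

dag = """
graph TD
  temperature --> growth
  nutrients --> growth
"""
print(mermaid_to_stan(dag))
```

<p>A real translator would also need data and parameter blocks, priors and validity checks, but the sketch shows why the DAG alone can carry the model&#8217;s structure.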
</p><p>I&#8217;d rather teach them statistics and reasoning first, coding second, if they think they&#8217;ll go into a quant-heavy career. </p><p>GenAI opens new possibilities for creating apps where we can reason about an analysis in alternative ways to coding, but still translate that reasoning into code that works. </p>]]></content:encoded></item><item><title><![CDATA[Declaring generative AI use in research and university studies]]></title><description><![CDATA[Here's a list of ways generative AI might be used]]></description><link>https://vitaexmachina.substack.com/p/declaring-generative-ai-use-in-research</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/declaring-generative-ai-use-in-research</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Tue, 24 Feb 2026 19:35:36 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I often find requests to &#8216;declare use of generative AI&#8217; frustratingly vague. There are so many different potential uses. And use can be reported at so many different levels of granularity. </p><p>Then there are many grey areas. My university&#8217;s policy is that students cannot submit generative AI work as their own, for assignments or theses. How does that apply to coding? What does it mean if the student thoughtfully writes a detailed prompt and then uses a coding agent to produce code for a statistical analysis? </p><p>Some peer-reviewed journals have asked for reporting of every line of code that is generated by AI. This would be incredibly complex to account for if you are using a coding agent or assistant. For much of my code I write the first few letters, then the AI takes over for the rest of the line. 
</p><p>To help move things along, here&#8217;s an initial list of potential uses, ordered by level of appropriateness for undergraduate studies (and thanks to my colleagues for helping develop this). My colleagues and I are developing a pro-forma for student assignments so they can tick off their uses and explain further if necessary. This covers writing, research and coding. </p><p>Comment if you have other uses I&#8217;ve missed! </p><p>I think I&#8217;m going to start using the same pro-forma for paper submissions, as it makes the uses much clearer. </p><p>I&#8217;ve divided it into: (1) &#8216;allowed uses, no further explanation needed&#8217;: the student just checks a box. (2) &#8216;allowed uses, need more explanation&#8217;: the student writes a short explanation of what they did. (3) uses not allowed. </p><p>The division into these three categories depends on the class and the learning goals, so they are just a rough guide. </p><h2>Research and writing</h2><h3>Allowed uses, no further explanation needed</h3><ul><li><p>Asked for advice on how to improve my writing</p></li><li><p>Brainstorming</p></li><li><p>Creating practice exams</p></li><li><p>Creating summaries of text for me to read</p></li><li><p>Translating materials into another language for me to read</p></li><li><p>Used AI-powered search engines, but then read source material myself</p></li></ul><h3>Allowed uses, need more explanation</h3><ul><li><p>Used grammar editors or AI-powered autocomplete to help me write</p></li><li><p>Used to suggest a structure for my assignment</p></li><li><p>Used to re-write parts of my assignment</p></li><li><p>Used to find references and attribute them in text</p></li><li><p>Created summaries of main sources that I then referred to when writing my assignment</p></li><li><p>Attached reference documents to my prompts to help with writing</p></li><li><p>Used to generate graphics and visualizations</p></li></ul><h3>Uses not allowed</h3><ul><li><p>Used for writing parts of the assignment</p></li><li><p>Used to write code that generates data</p></li><li><p>Used to generate figures, including data figures</p></li><li><p>AI tools were the only tools used to identify references</p></li><li><p>Asked AI to complete the assessment task based on materials provided by the lecturer</p></li><li><p>Used AI to paraphrase other text for use in my assignment</p></li></ul><h2>Coding</h2><h3>Allowed uses, no further explanation needed</h3><ul><li><p>Asked for tips on fixing bugs</p></li><li><p>Used deep research to help me find online tutorials</p></li><li><p>Created tutorials for coding</p></li><li><p>Used to explain code</p></li></ul><h3>Allowed uses, need more explanation</h3><ul><li><p>Used an AI autocomplete feature</p></li><li><p>Used to write code based on my own carefully worded instructions</p></li><li><p>Used prompts that included reference documents to write code</p></li><li><p>Used to help improve prompts before using those prompts to write code</p></li><li><p>Used to make an infographic about a methodology or protocol</p></li></ul><h3>Uses not allowed</h3><ul><li><p>Used to complete the assignment based on materials provided by the lecturer</p></li></ul>]]></content:encoded></item><item><title><![CDATA[AI tools and the social part of socio-ecological research]]></title><description><![CDATA[Hi there.]]></description><link>https://vitaexmachina.substack.com/p/ai-tools-and-the-social-part-of-socio</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/ai-tools-and-the-social-part-of-socio</guid><dc:creator><![CDATA[Carla Sbrocchi]]></dc:creator><pubDate>Fri, 20 Feb 2026 00:49:20 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<blockquote><p>Hi there. I&#8217;m Carla, 1/4 of the Vita ex Machina. Unlike the other 3/4 of the Vita crew, I&#8217;m not an early adopter of digital tools, I&#8217;m no expert in R and I&#8217;m a novice dabbler when it comes to AI tools. <a href="https://www.mentalfloss.com/article/518759/6-priceless-documents-reveal-key-moments-early-einsteins-career">I have no special talent, but I am passionately curious</a>.</p><p>Given the chatter about the potential for AI tools to transform research, I&#8217;ve been interested in how AI tools are being tested and applied in the fields of research in which I work - social sciences and marine management.</p><p>Social sciences, if you&#8217;re not familiar, are used to study patterns in human systems and how people make meaning of their surroundings - what&#8217;s important to them, and what are the underlying mechanisms that contribute to their life experiences and settings.</p><p>It&#8217;s these &#8216;underlying mechanisms&#8217; that are causing some strife in the social science research world when it comes to AI tools, particularly GenAI. Social scientists are wary that the use of these tools <a href="https://www.sciencedirect.com/science/article/pii/S2949882125000295#:~:text=Among%20the%20myriads%20of%20harmful,Connor%20&amp;%20Liu%2C%202023).">perpetuates existing biases</a><strong> </strong>that prejudice or privilege certain information or perspectives, affecting, for example, how programs are delivered or the types of information that are selected. 
Social scientists are also concerned that <a href="https://www.nature.com/articles/d41586-026-00221-8">chatbots are responding to surveys</a>, a common research tool in our field, and giving false or misleading representations of human participants.</p><p>On the other hand, there are real opportunities to incorporate technologies like GenAI in our research toolkit. <a href="https://www.cell.com/cell-reports-sustainability/fulltext/S2949-7906(24)00207-6">LLMs are great at systematic literature reviews</a>, <a href="https://www.undp.org/acceleratorlabs/blog/visualizing-future-artificial-intelligence-climate-action">image generation tools can help stakeholders visually forecast future landscapes</a> and can help <a href="https://www.sciencedirect.com/science/article/pii/S2468502X24000160">researchers visualise their data</a>. AI tools are <a href="https://journals.sagepub.com/doi/abs/10.1177/10778004251412871">becoming better at categorising qualitative data</a> (what social scientists call coding, though it differs from what computer scientists mean by coding), such that some people in the field think it will completely revolutionise social science research methods.</p><p>The world is still in the early days of AI tool development, and everyone is learning along the way. Tools and capabilities will change rapidly in coming years. Tests of these tools and capabilities will reveal strengths and weaknesses, allowing people to choose a given tool for a given task. <a href="https://openai.com/index/ai-safety-needs-social-scientists/">There is broad recognition that social sciences are critical</a> (in both senses of the word - important and evaluative) in this space, to ensure safety measures are incorporated. As a social scientist, I&#8217;m watching this area with interest and will share what I learn that can contribute to the world of conservation management sciences. 
If you&#8217;re also interested in this stuff - let&#8217;s connect!</p></blockquote>]]></content:encoded></item><item><title><![CDATA[Ask an LLM to pre-review your manuscript before submission]]></title><description><![CDATA[Here's the prompt we use to get the LLM to be 'reviewer 2']]></description><link>https://vitaexmachina.substack.com/p/ask-an-llm-to-pre-review-your-manuscript</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/ask-an-llm-to-pre-review-your-manuscript</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Thu, 19 Feb 2026 00:20:06 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Getting an LLM to give very critical feedback on your writing is one way to learn to write better. We find this prompt helpful to run towards the end of manuscript completion, just before it is submitted for peer-review. </p><p>We recommend using a Claude model over a GPT model for this, because GPT tends to be more sycophantic. With this LLM pre-review you want it to be as critical as possible, so you can iron out issues before real peer-review. </p><p>It is important to try to get the LLM to think this is not your own work - models tend to be more forgiving and agreeable if they think they are looking at your work. Remove author names from the manuscript and turn off memory features to prevent it guessing this is your own work (e.g. in Copilot use &#8216;Temporary chat&#8217;). </p><p>Here&#8217;s the prompt: </p><p><code>I want you to peer-review this manuscript that has been sent to me.</code></p><p><code>Take the role of Reviewer 2, the most critical peer-reviewer who will suggest a multitude of ways to improve the manuscript. As reviewer 2 you:</code></p><ul><li><p><code>Fundamentally object to the study&#8217;s main thesis.</code></p></li><li><p><code>Take exception to the experimental design. 
Point out how the sampling is insufficient and the analysis or design is confounded.</code></p></li><li><p><code>Question the relevance of modelling analyses to the real world and point out missing sensitivity analyses.</code></p></li><li><p><code>Are a seasoned expert in the topic who is pedantic about methodological rigor and attention to detail.</code></p></li><li><p><code>Criticise the manuscript&#8217;s lack of novelty and lack of impact on the discipline.</code></p></li><li><p><code>Are unforgiving about gaps in the literature.</code></p></li><li><p><code>Ruthlessly point out misleading narratives, confirmation bias, sycophantic tones, systemic biases, and factual inaccuracies.</code></p></li><li><p><code>Are unrelenting in identifying and criticising duplication and inconsistent use of terms.</code></p></li></ul><p>Obviously check any references it suggests yourself. I wrote this for me (I do modelling) but you can tweak it for your discipline. </p><p>Here&#8217;s part of a response I got for one manuscript that was at final draft:</p><p>&#8220;While the dataset is substantial and the research question timely, the manuscript suffers from fundamental conceptual issues, methodological limitations, and incomplete execution that severely undermine its contributions. Estimated time for revisions 6-12 months.&#8221;</p><p>It won&#8217;t really take me 6 months to revise it, but the advice it gave in that review was helpful for clarifying my thinking and anticipating reader misunderstandings. </p>]]></content:encoded></item><item><title><![CDATA[LLMs + R resource guide]]></title><description><![CDATA[February 2026 update]]></description><link>https://vitaexmachina.substack.com/p/llms-r-resource-guide</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/llms-r-resource-guide</guid><dc:creator><![CDATA[Luis D. 
Verde Arregoitia]]></dc:creator><pubDate>Tue, 17 Feb 2026 02:45:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!pJ_U!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>For the past year or so I&#8217;ve tried my best to keep this online guide to tools, packages, and resources for working with LLMs in R up to date. This month I added 8 more packages and an IDE extension.</p><p>Check it out here: <strong><a href="https://luisdva.github.io/llmsr-book/">Large Language Model tools for R</a></strong></p><h2>R packages</h2><ul><li><p>Interactive code review with automated suggestions (<code>reviewer</code>)</p></li><li><p>Predictive modeling assistant built on tidymodels (<code>predictive</code>)</p></li><li><p>AI coding agent CLI written entirely in R (<code>llamaR</code>)</p></li><li><p>Composable framework for building prompts with validation (<code>tidyprompt</code>)</p></li><li><p>ReAct architecture for automated data analysis (<code>llmflow</code>)</p></li><li><p>Native R access to Hugging Face models and datasets (<code>huggingfaceR</code>)</p></li><li><p>Speech-to-text with Whisper in pure R (<code>whisper</code>)</p></li><li><p>Text-to-speech using torch without Python dependencies (<code>chatterbox</code>)</p></li></ul><h4>Other tools and extensions</h4><ul><li><p><strong>Gemini CLI companion</strong> - open-vsx extension with enhanced IDE integration</p></li></ul><p>For those unfamiliar with the guide, it is a Quarto online book available in English and Spanish (thanks to the babelquarto package) where I collect all the tools and reading materials I come across. Lately people have also been reaching out with new developments or items I may have missed. 
</p><p>The landing page features this nice interactive tile of logos for the different tools, made with my hexsession package.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pJ_U!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pJ_U!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 424w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 848w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 1272w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pJ_U!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png" width="1204" height="1046" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1046,&quot;width&quot;:1204,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1046992,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://vitaexmachina.substack.com/i/188213250?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pJ_U!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 424w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 848w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 1272w, https://substackcdn.com/image/fetch/$s_!pJ_U!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44bb325a-927e-4138-beb8-f484e912ec18_1204x1046.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[Signalling credible publications to editors]]></title><description><![CDATA[Generative AI is removing the writing barrier to publication.]]></description><link>https://vitaexmachina.substack.com/p/signalling-credible-publications</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/signalling-credible-publications</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Wed, 11 Feb 2026 21:50:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Generative AI is removing the writing barrier to 
publication. Quality writing was one of the signals that editors and reviewers used to differentiate high-quality from low-quality work. But now it&#8217;s easy to produce low-quality work that is well written.</p><p>As AI slop becomes more ubiquitous, the publication system will need to look for new signals of quality research. The quality of the data may be one signal.</p><p>Author names and status may be another.</p><p>News from the cybersecurity world provides an interesting hint as to how the future of publication may look.</p><p>Many widely used open source software packages allow anyone to flag security flaws. Sometimes they offer bounties or rewards for identifying flaws. But genAI coding agents have flooded these projects with submissions, most of them AI slop. <a href="https://www.theregister.com/2026/01/21/curl_ends_bug_bounty/">This creates a huge burden on the software maintainers to manage and filter through those submissions</a>.</p><p><a href="https://simonwillison.net/2026/Feb/7/vouch/">A recently proposed system for &#8216;vouching&#8217;</a> allows the software maintainers to create a list of verified contributors. It effectively creates an allow list of contributors to an open software project. 
If you aren&#8217;t &#8216;vouched&#8217; you are blocked from contributing.</p><p>The flood of security submissions is much like the trend we are hearing about at peer-reviewed journals: increasing numbers of well-written but vacuous studies.</p><p>So maybe in the near future we&#8217;ll have to get our names on a whitelist before we can submit to reputable journals.</p>]]></content:encoded></item><item><title><![CDATA[Don't use AI generated photos in scientific presentations]]></title><description><![CDATA[It will harm your credibility; use cartoons instead]]></description><link>https://vitaexmachina.substack.com/p/dont-use-ai-generated-photos-in-scientific</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/dont-use-ai-generated-photos-in-scientific</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Fri, 06 Feb 2026 20:30:22 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MznX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Image generation models are powerful tools for <a href="https://vitaexmachina.substack.com/p/tips-on-making-science-infographics">science communication</a>. But generative AI also (rightfully) has a reputation for hallucinations and making stuff up. </p><p>If you use AI-generated images, your science and your reputation could come to be associated with making stuff up. This is not a good look for your science career. </p><p>One early-career researcher I saw recently gave a presentation with AI-generated photos. 
They didn&#8217;t declare this, but the images clearly were AI generated because they contained mistakes that were obvious to anyone who knew the ecosystems pictured.</p><p>Just to drive home my point, imagine if you saw this in a scientific presentation (without the declaration):</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MznX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MznX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!MznX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!MznX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!MznX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MznX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg" width="1024" height="1024" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:401465,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://vitaexmachina.substack.com/i/186654829?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MznX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!MznX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!MznX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!MznX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb1c679b2-bb8d-4e3c-95c4-41745dd8f77f_1024x1024.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">In the real world there are no penguins in the Arctic and there are no polar bears in Antarctica.</figcaption></figure></div><p>It would leave the audience doubting the rest of the presentation. Were the graphs made with real data or AI generated? Is the data real? What can we trust? </p><p>So a few tips on using image generation models in science settings, especially in presentations:</p><ol><li><p>Declare genAI use</p></li><li><p>Use cartoons, not photo realistic images</p></li><li><p>Use reference images to get accurate representations</p></li></ol><h2>1. Declare genAI use</h2><p>Be honest; it&#8217;s good for your integrity. I&#8217;ve been experimenting with different ways of declaring AI use. Some people are 100% opposed to AI use and won&#8217;t like seeing your AI images. 
But it&#8217;s better to be honest; they&#8217;d be more upset if they found out later that the images were AI. They&#8217;ll probably guess anyway. </p><p>Other people will appreciate the honesty. </p><p>Norms around AI use are still developing, so be a pioneer in best practice and declare what you are doing. </p><p>I&#8217;ve also been adding to my images: &#8220;AI generated, human verified&#8221;. Where an image contains facts and figures (like an <a href="https://vitaexmachina.substack.com/p/tips-on-making-science-infographics">infographic</a>) it&#8217;s important to let the audience know that you&#8217;ve checked the figures are accurate. </p><h2>2. Use cartoons, not photo realistic images</h2><p>Cartoons are clearly fake. Making a cartoon isn&#8217;t making up fake data, whereas an AI photo is very close to fake data. </p><p>If I wanted to make a joke in my presentation about common misunderstandings of species biogeography, I could use an image like this: </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CP1e!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CP1e!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!CP1e!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!CP1e!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!CP1e!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CP1e!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:679012,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://vitaexmachina.substack.com/i/186654829?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CP1e!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!CP1e!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!CP1e!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!CP1e!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F243324f1-19c8-41d5-a14f-6d48d2c81f17_1024x1024.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Clearly fake image of a polar bear chasing penguins. The cute and cheerful animals belie a harsh truth: if bears did manage to swim to Antarctica, there wouldn&#8217;t be penguins there for much longer. </figcaption></figure></div><p>I also ask the AI to use cartoons or different drawing styles for infographics. For example, try &#8216;Make the image in watercolor style&#8217; or &#8216;in line art botanical style&#8217;. You can be creative with different styles. </p><h2>3. Use reference images </h2><p>AI image models like Google&#8217;s Nano Banana can take a reference image and re-use it in the generated image. For instance, when I was making an infographic about Tasmanian kelp forests, the AI kept using <a href="https://www.seascapemodels.org/posts/2025-11-28-tips-making-infographics-with-ai/">North American fish</a>. This wasn&#8217;t great for my credibility as a Tasmanian ecologist. </p><p>I solved the problem by attaching photos of Tasmanian fish. The AI was able to redraw these in the style I wanted (watercolor) and the species look truer to the ecosystem I was trying to represent. </p><p>To wrap up, the way you deliver your presentations impacts your credibility as a scientist. Norms around AI use are still developing, so it&#8217;s best to be honest: declare AI use, use cartoons, and use reference images to get the biota right. </p>]]></content:encoded></item><item><title><![CDATA[Online book on coding in R with AI assistants]]></title><description><![CDATA[I&#8217;ve just updated my online resource, AI Assistants for Scientific Coding. 
It&#8217;s a practical guide to using language models to support scientific computing and analysis.]]></description><link>https://vitaexmachina.substack.com/p/online-book-on-coding-in-r-with-ai</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/online-book-on-coding-in-r-with-ai</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Tue, 03 Feb 2026 19:57:06 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I&#8217;ve just updated my online resource, <a href="https://www.seascapemodels.org/AI-assistants-for-scientific-coding/">AI Assistants for Scientific Coding</a>. It&#8217;s a practical guide to using language models to support scientific computing and analysis.</p><p>The book focuses on helping people who already use R for data analysis. It&#8217;s not an introduction to programming; instead, it shows how to work with AI tools effectively and responsibly once you know the basics. </p><p>What you&#8217;ll find inside:</p><ul><li><p>Choosing and using AI coding assistants, from simple chat tools to agents that can run and test code</p></li><li><p>Prompting strategies that improve reliability for real analysis tasks</p></li><li><p>Examples from environmental science (GLMs, multivariate stats), with methods general to other fields</p></li><li><p>Notes on ethics, copyright, costs, and environmental impacts</p></li></ul><p>The material also serves as reference notes for a one&#8209;day workshop and will evolve as the field changes. If you&#8217;re interested in the prompting side of statistical workflows, there&#8217;s an <a href="https://doi.org/10.32942/X2CS80">accompanying preprint</a>. 
</p>]]></content:encoded></item><item><title><![CDATA[Should we still teach R coding in this age of genAI?]]></title><description><![CDATA[I often get asked if we should still be teaching coding skills to students and researchers now that we have generative AI tools that can write code for us.]]></description><link>https://vitaexmachina.substack.com/p/should-we-still-teach-r-coding-in</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/should-we-still-teach-r-coding-in</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Tue, 03 Feb 2026 19:10:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Qt-g!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F347223a6-d641-4278-95bd-31fe6ab7f1a3_702x702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I often get asked if we should still be teaching coding skills to students and researchers now that we have generative AI tools that can write code for us.</p><p>The general consensus is yes, we should still teach coding. For an ecology-specific <a href="https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/2041-210X.14325">argument see here</a>.</p><p>Teaching coding is teaching people how to think logically and structure problems. These skills are essential not only in coding, but also for good science and problem solving in general.</p><p>So you still need to learn to code in order to learn to think logically and structure problems.</p><p>A more practical argument is that LLMs are good at generating code when they have seen many examples of that code in their training data. But they are not so good at creative work on novel topics or in niche areas.</p><p>For example, AI coding assistants struggle to make a simple <code>tmap</code> (the R mapping package) work well. 
<code>tmap</code> was updated recently, meaning the examples in the LLM&#8217;s training data are out of date.</p><p>The coding assistants also seem to default to the code patterns they are most familiar with, like <code>ggplot2</code> syntax. Not all of this works with <code>tmap</code>.</p><p>In general, I find they perform much better at statistical modelling in R than they do with complex geospatial analyses.</p><p>This practical issue may become less relevant over time as LLMs get better and as we create better resources to inform their actions (like a tmap-specific guide for LLMs to read before advising you, or telling your LLM to do a web search of the tmap website).</p><p>But it is still likely that frontier ecological modelling will require deep human engagement with code. So learning to code is important for researchers.</p><p>I&#8217;ll leave the final say to Andrej Karpathy, former head of AI at Tesla and a founding member of OpenAI&#8217;s research group.</p><p>In a <a href="https://www.dwarkesh.com/p/andrej-karpathy">recent podcast interview</a> he talks about a repo he made to help teach people how to work with LLMs. His advice to learners is literally to re-write his repo by hand, not even to cut and paste.</p><p>He goes on to explain that AI agents are good for very standard code, or code that is common on the internet. But they perform poorly at creating new code, or code that contradicts common patterns.</p><p>For his own research code he most commonly uses the code auto-complete features that AI assistants offer, rather than AI agents.</p><p>He suggests that to really learn something you should code it yourself, using examples only as references.</p><p>Karpathy coined the term &#8216;vibe-coding&#8217;, which means creating code by prompting, without reviewing the code. Usually an AI agent is used. 
So it&#8217;s saying a lot that he thinks learning means doing it the old-fashioned way: writing the code yourself.</p>]]></content:encoded></item><item><title><![CDATA[Tips on making science infographics with AI and nano banana]]></title><description><![CDATA[Communicate your science to a broader audience]]></description><link>https://vitaexmachina.substack.com/p/tips-on-making-science-infographics</link><guid isPermaLink="false">https://vitaexmachina.substack.com/p/tips-on-making-science-infographics</guid><dc:creator><![CDATA[Chris Brown]]></dc:creator><pubDate>Mon, 02 Feb 2026 19:56:48 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zgvB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I&#8217;ve been playing around with <a href="https://www.seascapemodels.org/posts/2025-11-23-exploring-gemini-3-image-generation/">Google Gemini Pro 3 (AKA Nano Banana)</a> as a way to make science infographics. Here are the key lessons from my attempts.</p><p>For the record, I&#8217;d still prefer to work with a human to create great infographics. A good designer will give you much more than just the image; they will help you think through the key points and how to represent them most accurately. You&#8217;ll see below that I had some issues with the scientific accuracy of the AI-generated images. But I don&#8217;t have the budget for a designer on every project. AI images are good if you are clear on the key points you want to communicate.</p><p>The best image-generating model requires a monthly subscription fee. However, there is a way around the subscription so you can pay per image (about USD$1 per image). I&#8217;m accessing the model via API calls. This is a bit more technical, but much cheaper for the occasional user (like me). 
Instructions with Python code are at the end of this post. </p><h2>Be clear on the key points of your study</h2><p>As with any science communication, it&#8217;s important to be clear on the key points that you want to communicate from the outset. Then state those key points in your prompt.</p><p>But, another approach is&#8230;</p><h2>See what the AI comes up with</h2><p>Just point the AI to your study using a web search. It is pretty good at simultaneously interpreting the key points from the abstract and then generating an image to go along with them. Here&#8217;s my first attempt:</p><pre><code><code>Create a two panel infographic in cartoon style showing fisheries species that benefit from giant kelp restoration. In the left panel show small Ecklonia plants. In the right panel show giant kelp. Base the infographic on the recent study 'Predicting the impact of giant kelp restoration on food webs and fisheries production'. Include the 40x productivity boost. But don't include the numbers for the fishery species, just show which ones go up. 
Use bright colors and clear labels.</code></code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zgvB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zgvB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zgvB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg" width="1200" height="896" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:896,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zgvB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 424w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 848w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!zgvB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa6a0def8-990e-4aa4-b6ed-c2df5e12db1a_1200x896.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>With the web search feature enabled, the model can find the article via the title I provided in the prompt.</p><p>This looks pretty nice for a very simple prompt, and the &#8216;40x&#8217; stat is accurate to the paper.</p><p>However, on closer inspection there are a few issues.</p><p>The fish look North American, but our study was in Tasmania. The abalone are upside down.</p><p>The biggest issue is that the image implies there was a 40x boost in productivity to fisheries, which is not what we found. <a href="https://onlinelibrary.wiley.com/doi/10.1002/aqc.70242">We found a 40x boost in kelp productivity that translated into a 1-7% boost for fisheries</a>.</p><h2>Name a style</h2><p>It&#8217;s fun to play around with different styles. For the infographics below I tried &#8216;water colour style&#8217; and &#8216;line-art botanical style&#8217;. 
You can get ideas for styles from lists on the web or from a chatbot.</p><p>I don&#8217;t want the images to be too much like photos; I want it to be obvious that they are not &#8216;real&#8217;.</p><p>Some ideas for styles to mention:</p><ul><li><p>Cartoon (the AI will choose its own style)</p></li><li><p>Watercolour</p></li><li><p>Botanical line-art</p></li><li><p>Like &#8216;The Simpsons&#8217; or your favourite cartoon. Just add &#8220;Don&#8217;t include any proprietary characters in the image&#8221; to avoid it using images that might violate copyright.</p></li><li><p>Gothic</p></li><li><p>Pencil sketch</p></li><li><p>Manga</p></li></ul><p>Or anything common enough that the AI will know it.</p><h2>Include reference images</h2><p>For my next attempt I tried a water colour style and attached reference images of Tasmanian reef fauna to the prompt:</p><pre><code><code>Create a two panel infographic in water colour style showing fisheries species that benefit from giant kelp restoration. In the left panel show a rocky reef with dense but small Ecklonia plants. In the right panel show a rocky reef with a healthy giant kelp forest. Include on the right panel the text: 40x kelp productivity boost. Include the animals in the reference images, showing more of them in the right panel. Redraw the animals to match the water colour style and so they look a natural part of the scene, but keep their essential morphological features intact. 
</code></code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!M2WX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!M2WX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 424w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 848w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!M2WX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg" width="1152" height="928" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:928,&quot;width&quot;:1152,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!M2WX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 424w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 848w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!M2WX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5bcd136-ea86-43e2-8def-f4e212e48c04_1152x928.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>This looks more like a Tasmanian reef and makes it clearer that the 40x applies to kelp. But unfortunately the abalone are still upside down: I accidentally included an image of the abalone foot in my reference list.</p><p>The image also lacks context, so I wanted to try adding some text in the watercolour style:</p><pre><code><code>Update this infographic to add a sub-panel to the left hand panel that reads: 'Giant kelp forests have disappeared in many places, but are more productive than other algae. O'Neill et al. used models to predict how giant kelp restoration can enhance fisheries'. 
Adjust the composition so all text is clearly readable and the images are arranged in a balanced, visually appealing layout.</code></code></pre><p>I also corrected the reference image for abalone to only show the shell.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!88TX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!88TX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 424w, https://substackcdn.com/image/fetch/$s_!88TX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 848w, https://substackcdn.com/image/fetch/$s_!88TX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!88TX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!88TX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg" width="1152" height="928" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:928,&quot;width&quot;:1152,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!88TX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 424w, https://substackcdn.com/image/fetch/$s_!88TX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 848w, https://substackcdn.com/image/fetch/$s_!88TX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!88TX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb521aebc-832f-414d-b04c-93c02dc123cc_1152x928.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The coauthors and I were happy with this version. 
I then used PowerPoint to add the study&#8217;s DOI along with the statement: &#8220;Image: AI generated, human verified&#8221;.</p><p>I think people like to know what is generated by AI, and I wanted my audience to know that I had checked it for factual accuracy.</p><h2>The more text, the more likely there are errors</h2><p>I had one more attempt with <a href="http://doi.org/10.1111/rec.70261">a study about mangrove restoration</a>.</p><p>For this one I used the prompt:</p><pre><code><code>Create an infographic in line-art botanical style explaining the following study for a general audience: ...</code></code></pre><p>Then I included the entire abstract in the prompt.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!R2BU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!R2BU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!R2BU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!R2BU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!R2BU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!R2BU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!R2BU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 424w, https://substackcdn.com/image/fetch/$s_!R2BU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 848w, https://substackcdn.com/image/fetch/$s_!R2BU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!R2BU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc257a227-5dd3-47f3-9d3a-6b1c912c6d18_1024x1024.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>If you look closely you will notice lots of errors in the text.</p><h2>Prompting with the study abstract didn&#8217;t work as well</h2><p>The infographic based on the abstract was also too detailed for what I wanted. 
It would be better to go back to tip 1 (be clear on study outcomes) and provide the AI with just 1-3 key points to represent.</p><h2>Leave space for post-production modifications</h2><p>I&#8217;d like to be able to modify these easily by adding text boxes in PowerPoint (I&#8217;m not sophisticated with image editing!).</p><p>My final tip is to ask the AI to leave a &#8216;white box&#8217; in a certain position; that way you can add the text yourself later and make sure it is accurate. Like:</p><pre><code><code>Create an infographic in the style of ... Leave a white box in the bottom right hand corner that is 1/3 of the image wide and 1/5 of the image tall. </code></code></pre><h2>Example Python code</h2><p>See the <a href="https://ai.google.dev/gemini-api/docs/image-generation">Google API reference for image generation</a>.</p><p>Signing up for AI Studio with Google is a bit of a pain and I can&#8217;t give you instructions because I&#8217;m not sure how I did it! Just try to follow their instructions and eventually you&#8217;ll get it right. They also give you tonnes of free credits, so I haven&#8217;t paid for an image yet (though I think you might have to buy some credits in your account to get the free credits).</p><p>Once that is set up you will need to get a copy of your API key (which works like a password) and save it as an environment variable. Then I run Python from VS Code as below.</p><p>Here&#8217;s the code, modified from their examples, that uses reference images:</p><pre><code><code>from google import genai
from google.genai import types
import os
from PIL import Image

from dotenv import load_dotenv
load_dotenv()  # reads GEMINI_API_KEY from a local .env file into the environment

api_key = os.getenv("GEMINI_API_KEY")


prompt = "Create a two panel infographic in water colour style showing fisheries species that benefit from giant kelp restoration. In the left panel show a rocky reef with dense but small Ecklonia plants. In the right panel show a rocky reef with a healthy giant kelp forest. Include on the right panel the text: 40x kelp productivity boost. Include the animals in the reference images, showing more of them in the right panel. Redraw the animals to match the water colour style and so they look a natural part of the scene, but keep their essential morphological features intact. "
aspect_ratio = "5:4" # "1:1","2:3","3:2","3:4","4:3","4:5","5:4","9:16","16:9","21:9"
resolution = "1K" # "1K", "2K", "4K"
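# A small guard of my own (not part of the Gemini API): fail fast if a
# setting isn't one of the documented values, rather than wasting an API call.
ALLOWED_RATIOS = ["1:1", "2:3", "3:2", "3:4", "4:3", "4:5", "5:4", "9:16", "16:9", "21:9"]
ALLOWED_SIZES = ["1K", "2K", "4K"]

def check_option(value, allowed):
    if value not in allowed:
        raise ValueError(f"{value!r} is not one of {allowed}")
    return value

check_option("5:4", ALLOWED_RATIOS)  # e.g. validate aspect_ratio before the call
check_option("1K", ALLOWED_SIZES)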

client = genai.Client(api_key=api_key)  # falls back to the GEMINI_API_KEY env var if omitted

response = client.models.generate_content(
    model="gemini-3-pro-image-preview",
    contents=[
        prompt,
        Image.open('gemini-api/reef-images/abalone.png'),
        Image.open('gemini-api/reef-images/crab.png'),
        Image.open('gemini-api/reef-images/ecklonia.jpg'),
        Image.open('gemini-api/reef-images/lobster.png'),
        Image.open('gemini-api/reef-images/trumpeter.png'),
        Image.open('gemini-api/reef-images/wrasse.png')
    ],
    config=types.GenerateContentConfig(
        response_modalities=['TEXT', 'IMAGE'],
        image_config=types.ImageConfig(
            aspect_ratio=aspect_ratio,
            image_size=resolution
        ),
    )
)

# Print any text the model returns and save the generated image
for part in response.parts:
    if part.text is not None:
        print(part.text)
    elif image := part.as_image():
        image.save("gemini-api/reef-images/reef-scene3.png")

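# Post-production sketch (my own addition, following the white-box tip above):
# draw your verified caption straight into the reserved box with PIL, as an
# alternative to PowerPoint. The coordinates here are hypothetical; measure
# the box position from your generated image.
from PIL import Image, ImageDraw

def add_caption(image, text, xy, fill="black"):
    # Draws `text` with PIL's default font at pixel position `xy`
    ImageDraw.Draw(image).text(xy, text, fill=fill)
    return image

# Demonstrated on a blank canvas; swap in the saved reef-scene3.png for real use
demo = add_caption(Image.new("RGB", (300, 80), "white"),
                   "Image: AI generated, human verified", (10, 30))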
</code></code></pre>]]></content:encoded></item></channel></rss>