- Goals
- Motivation
- Shift in my perspective on AI
- Key takeaways
- Experiment
- Success criteria
- Hypothesis
- Results
- Next step: custom prompts
- Methodology
- Reading from Notion
- Prompting Claude
- Writing back to Notion
- Appendix
- Debugging MCP
Goals
- Improve my writing and editing workflow, with emphasis on editing. While I'm comfortable with capturing thoughts as initial drafts, refining content into polished final drafts is my biggest area for improvement.
- Gain practical experience and develop a perspective on the capabilities and limitations of AI tools, as discussed later.
Motivation
My experience with both Notion AI and Claude for editorial support on my personal writing has been positive. While Notion AI offers seamless integration with Notion, its tool-specific nature makes the $8/month pricing less compelling than a more versatile tool like Claude, which is free with usage limits or $20/month with fewer restrictions. Claude also offers broader utility beyond Notion; the Hard Fork podcast discusses some…interesting emerging use cases. But Claude lacks native Notion integration, so copy-pasting between tools is required.
Shift in my perspective on AI
I became less skeptical of the AI tool wave after listening to the podcast episode AI tools for software engineers, but without the hype – with Simon Willison (available on Spotify or directly on The Pragmatic Engineer newsletter). As the co-creator of Django – a framework popular for enabling rapid development – and a heavy AI user, Willison comes across as highly informed yet approachable. Even though I read a decent amount about how others are using AI, this episode actually inspired me to experiment more actively, including this very exercise.
Key takeaways
- Actively engage: Everyone should experiment with AI tools to understand their practical value.
- Take a pragmatic approach: While ethical concerns about unlicensed training data and environmental impact are valid, the genie is out of the bottle. If you aren’t at least testing these tools, you diminish your odds of adapting to a changing landscape.
- Don’t write it off: Even if you’ve tried before, try again. Tool capabilities evolve rapidly. Also, getting good with new tools takes experimentation, trial and error, and patience.
- Don’t be intimidated: Claude has good documentation that is generally applicable to other LLMs as well. There are communities sharing knowledge. Getting started is getting easier.
- Play with multiple models: Current leading options include Claude 3.5 Sonnet and GPT-4, each with strengths in different tasks.
Experiment
Anthropic, the company behind Claude, released the Model Context Protocol (MCP) recently. Essentially, it allows users to provide their own backends for retrieving data to load into a chat context. Someone has already written a Notion server for the protocol, mcp-notion-server, accompanied by a blog post, Operating Notion via Claude Desktop Using MCP!. I focused on Notion since it's one of my primary writing and sharing tools, but there are also official and open-source MCP servers for Obsidian, Apple Notes, and others.
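To make the protocol less abstract, here is a rough sketch of what an MCP server looks like using Anthropic's TypeScript SDK (@modelcontextprotocol/sdk). This is my own illustration of the general shape, not mcp-notion-server's actual code; the retrieve_page tool name and its schema are made up for the example:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Declare the server and the capabilities it exposes to the client (Claude Desktop).
const server = new Server(
  { name: "notion-sketch", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise a single hypothetical tool for pulling a Notion page into context.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "retrieve_page",
      description: "Load a Notion page's blocks into the chat context",
      inputSchema: {
        type: "object",
        properties: { page_id: { type: "string" } },
        required: ["page_id"],
      },
    },
  ],
}));

// Handle tool calls. A real server would call Notion's REST API here;
// this sketch just echoes the request so it stays self-contained.
server.setRequestHandler(CallToolRequestSchema, async (request) => ({
  content: [
    {
      type: "text",
      text: `Would fetch Notion page ${request.params.arguments?.page_id}`,
    },
  ],
}));

// Claude Desktop launches the server as a subprocess and talks to it over stdio.
await server.connect(new StdioServerTransport());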
Success criteria
I would consider this experiment a success if I am able to
- Integrate Claude directly within Notion, playing nicely with the latter’s block-based document structure.
- Alternatively, enable Claude to read and update Notion pages in non-corrupting ways.
Hypothesis
I’m assuming, based on prior forays into AI-assisted proofreading, that the quality of my writing will improve if I pass drafts through an LLM. So I believe that if the friction of using these tools is reduced, I’m more likely to use them, and therefore to publish higher-quality work more frequently.
Results
I was able to achieve the second of my success criteria, “enable Claude to read and update Notion pages in non-corrupting ways.”
While not matching Notion AI's native integration, using MCP to bridge Claude and Notion significantly reduces friction compared to manual copy-pasting. The workflow produced valuable editorial feedback that I could selectively implement.
I did experience issues with things like images, links, and tables being dropped when going from Claude back to Notion. When I asked Claude whether those changes were intentional or incidental, it did offer to add them back, but they were appended to the end of the doc rather than restored inline.
Next step: custom prompts
Goal: Create prompts that better preserve my personal writing style while still improving clarity.
Between the model and the prompt, there is a push towards conciseness. That has been a net positive, given my verbose writing style, but it often goes too far. I don’t have a desire to be maximally concise, because some of that wordiness and funky word choice is part of who I am, and not objectively a wrinkle to smooth out.
Methodology
Reading from Notion
As a test, I used a 2000+ word draft that I’ve been musing about for a while.
After some setup and debugging, Claude was able to successfully load it into context.
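Under the hood, the MCP server is just wrapping Notion's REST API. A minimal sketch of roughly what reading a page involves – paginating through its block children – is below. This is my own illustration rather than the server's actual code, though the endpoint and Notion-Version header come from Notion's public API docs. Note the reliance on the global `fetch` API, which is exactly what produced the `fetch is not defined` error described in the appendix:

// Sketch: list all child blocks of a Notion page, following pagination.
// NOTION_API_TOKEN is the same integration token referenced in the config below.
async function listBlockChildren(blockId: string): Promise<unknown[]> {
  const blocks: unknown[] = [];
  let cursor: string | undefined;

  do {
    const url = new URL(`https://api.notion.com/v1/blocks/${blockId}/children`);
    if (cursor) url.searchParams.set("start_cursor", cursor);

    const res = await fetch(url, {
      headers: {
        Authorization: `Bearer ${process.env.NOTION_API_TOKEN}`,
        "Notion-Version": "2022-06-28",
      },
    });
    if (!res.ok) throw new Error(`Notion API returned ${res.status}`);

    const body = await res.json();
    blocks.push(...body.results);
    cursor = body.has_more ? body.next_cursor : undefined;
  } while (cursor);

  return blocks;
}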
Prompting Claude
I used one of the pre-canned prompts provided by Claude, Prose polisher.
While I didn’t agree with all of the feedback, it was constructive and largely actionable, and there were suggestions that I did take.
Writing back to Notion
Original version’s word count: 2041. Edited version’s word count: 836.
Rather than making destructive changes to my original draft – which I’d initially called out as my preferred workflow – I found it best to either write to a new Notion page or append the edited version to the same page. That allows me to read through and selectively (and correctly) apply suggestions. For instance, when Claude tries to move a section, it may delete the original section header, leaving the rest of the content behind, then duplicate the section in the new location.
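Appending rather than overwriting also maps cleanly onto Notion's API, which has an endpoint for appending block children. A hedged sketch of what writing the edited version back might look like (again my own illustration, not the server's code):

// Sketch: append edited paragraphs to the end of a page instead of
// overwriting existing blocks, mirroring the non-destructive workflow above.
async function appendParagraphs(pageId: string, paragraphs: string[]): Promise<void> {
  const res = await fetch(`https://api.notion.com/v1/blocks/${pageId}/children`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${process.env.NOTION_API_TOKEN}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      children: paragraphs.map((text) => ({
        object: "block",
        type: "paragraph",
        paragraph: { rich_text: [{ type: "text", text: { content: text } }] },
      })),
    }),
  });
  if (!res.ok) throw new Error(`Notion API returned ${res.status}`);
}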
Appendix
Debugging MCP
Initially, I kept getting `{"error": fetch is not defined}` when calling my MCP server. I suspect this has something to do with the fact that Node 16 was the latest version I had installed when I started, and the `fetch` API was only made generally available in Node 21. Even after I installed the latest LTS version (22.12 at the time of writing) and set it as the default, I was still running into the `fetch is not defined` issue.
The fix was to:
- Use a local Node build rather than npx.
    - This allowed me to debug by doing things like logging `process.version` (see the snippet after this list). Those logs could then be viewed in ~/Library/Logs/Claude.
    - Pro tip: using the MCP Inspector made the debugging loop much faster (and cheaper) than testing directly in the Claude Desktop chat interface. It also allowed me to verify that the calls to Notion’s API were in fact succeeding.
- Set an absolute path to Node, even though I’m using NVM.
- Set an absolute path to the local build.
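As referenced above, the quickest sanity check was logging the runtime version from the server's entry point. As I understand it, stdout is reserved for the MCP protocol itself, so diagnostics should go to stderr, which is what ends up in the Claude logs:

// Quick sanity check: confirm which Node binary Claude Desktop actually launched.
// Use stderr (console.error) so we don't corrupt the stdio protocol stream.
console.error(`MCP server starting on Node ${process.version}`);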
My config ended up looking like this:
{
  "mcpServers": {
    "notion": {
      "command": "/Users/yourname/.nvm/versions/node/v22.12.0/bin/node",
      "args": ["/Users/yourname/projects/mcp-notion-server/notion/build"],
      "env": {
        "NOTION_API_TOKEN": "<token>"
      }
    }
  }
}
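(For reference, this block lives in Claude Desktop's claude_desktop_config.json, which on macOS is under ~/Library/Application Support/Claude; the server logs mentioned above end up in ~/Library/Logs/Claude.)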