Notion Updates 2026: New Features and How to Automate Them Beyond Notion

Matthew Diakonov
10 min read

Notion has shipped a lot in 2026. Dashboards, AI agents that run for 20 minutes autonomously, presentation mode, a redesigned sidebar, Workers for custom code execution, and performance improvements across the board. Every other article covering these updates tells you what changed inside Notion. This one is about what you can do with those features outside of Notion, using your entire Mac desktop as the automation surface.


1. What Notion actually shipped in 2026

Here is a condensed list of the features that matter most, drawn from Notion's official release notes and third-party trackers like Matthias Frank's changelog:

  • Dashboards (Notion 3.4) - Aggregated views across databases with charts, metrics, and filters. Replaces the old "linked database + gallery" workaround.
  • AI Agents - Autonomous agents that run for up to 20 minutes, working across Slack, Jira, Google Drive, and internal Notion databases. Multi-model support including Claude, GPT-5, and Gemini 3 Pro.
  • Presentation mode - Turn any Notion page into slides without exporting to PowerPoint or Google Slides.
  • Notion Workers - Custom code execution sandboxes that let agents run JavaScript within Notion. Think serverless functions scoped to your workspace.
  • Redesigned sidebar - Collapsible sections, faster navigation, better keyboard shortcuts.
  • Meeting notes with live transcription - Background audio capture with AI summaries, action item extraction, and auto-linking to relevant pages.
  • Performance improvements - 27% faster on Windows and 11% faster on Mac, according to Notion's benchmarks.
  • Map View and Feed View - New database view types for location data and chronological content.
  • Notion Mail on mobile - Email client integrated with your workspace, now available on iOS and Android.

These are real improvements. But they all operate within Notion's walls. The question this guide answers is: what happens when you need a workflow that starts in Notion and touches the rest of your computer?

2. Where Notion's own automations stop

Notion's internal automations (database triggers, AI agents, Workers) are powerful within their scope. But that scope has clear boundaries:

They cannot control other desktop apps

A Notion automation can update a database row or send a Slack message. It cannot open a file in Finder, run a terminal command, fill out a form in another app, or interact with any native macOS application.

They run in Notion's sandbox

Workers execute JavaScript in an isolated environment. They cannot access your filesystem, your clipboard, your other browser tabs, or any local service. This is a security feature, but it limits what workflows can do.

They cannot read the UI of other tools

Notion's integrations pull data via APIs (Jira, GitHub, Slack). But many tools you use daily have no API, or the API does not expose what you need. You end up copying and pasting manually.

This is not a criticism of Notion. Every SaaS tool has the same constraint: it can only automate what happens inside itself. The gap is everything between your apps.

3. How accessibility API automation works with Notion

Notion is an Electron app on macOS. Like all Electron apps, it exposes its UI through the macOS accessibility framework. Every button, text field, menu item, and page element has an accessibility node with a role, a label, and a position.

This is the same tree that screen readers like VoiceOver use. It is also the tree that Fazm reads when automating Notion.

What Fazm sees when it looks at Notion

Instead of taking a screenshot and guessing where things are, Fazm reads the accessibility tree. For Notion, this looks something like:

[Window] "My Workspace - Notion"
  [Group] "Sidebar"
    [Button] "New page" x:24 y:120 w:180 h:32
    [StaticText] "Favorites" x:24 y:168
    [Link] "Project Tracker" x:24 y:196
    [Link] "Meeting Notes" x:24 y:224
  [Group] "Main content"
    [StaticText] "Sprint Planning" x:280 y:80
    [TextField] "Type something..." x:280 y:140 w:640 h:32
    [Table] "Tasks database"
      [Row] "Fix login bug" | "In Progress" | "High"
      [Row] "Update docs" | "Todo" | "Medium"

Every element has an exact label, role, and coordinate. Fazm's mcp-server-macos-use binary reads this tree and passes it to the AI model as structured text. The model then decides which element to interact with by name, not by guessing pixel locations from a screenshot.
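As an illustration (not Fazm's actual internals), the lookup can be sketched in a few lines, with the tree represented as nested dictionaries mirroring the dump above:

```python
# Hypothetical sketch: locating a UI element in an accessibility tree
# by role and label instead of pixel position. The tree shape mirrors
# the Notion dump above and is illustrative only.

def find_element(node, role, label):
    """Depth-first search for the first node matching role and label."""
    if node.get("role") == role and node.get("label") == label:
        return node
    for child in node.get("children", []):
        match = find_element(child, role, label)
        if match:
            return match
    return None

notion_tree = {
    "role": "Window", "label": "My Workspace - Notion",
    "children": [
        {"role": "Group", "label": "Sidebar", "children": [
            {"role": "Button", "label": "New page",
             "x": 24, "y": 120, "w": 180, "h": 32},
            {"role": "Link", "label": "Project Tracker", "x": 24, "y": 196},
        ]},
    ],
}

button = find_element(notion_tree, "Button", "New page")
print(button["x"], button["y"])  # coordinates resolved from the label, not pixels
```

Because the search keys on role and label, the element is found even if its coordinates change between Notion releases.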

This is why accessibility API automation is more reliable than screenshot-based approaches for tools like Notion that update their UI frequently. When Notion ships a redesigned sidebar (as they did in 3.4), the visual layout changes, but the accessibility labels stay consistent. A button labeled "New page" keeps that label regardless of where it moves on screen or what icon it uses.

Fazm uses this approach for native macOS apps (via the accessibility API) and a parallel approach for web apps (Playwright accessibility snapshots that produce a similar structured tree from the browser DOM). Both avoid the latency and error rate of full-image vision processing.

4. Cross-app workflows: Notion plus the rest of your Mac

The practical value of desktop-level automation is workflows that span multiple apps. Here are concrete examples using Notion's 2026 features as the starting point:

Dashboard data to a spreadsheet to an email

Notion's new dashboards show aggregated metrics across databases. But your finance team wants that data in Excel, not Notion. Fazm can read the dashboard values from the accessibility tree, open Numbers or Excel, populate a spreadsheet, and attach it to an outgoing email in Mail. One prompt, three apps, zero manual copying.
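A minimal sketch of the middle step, assuming the dashboard metrics have already been read from the accessibility tree (the metric names and values here are hypothetical):

```python
import csv
import io

# Hypothetical metrics scraped from a Notion dashboard's accessibility tree.
dashboard_metrics = [
    {"metric": "Open tasks", "value": 42},
    {"metric": "Completed this sprint", "value": 17},
    {"metric": "Blocked", "value": 3},
]

# Render the metrics as CSV text that a spreadsheet app (or a Mail
# attachment) can consume; a real workflow would write this to a file.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["metric", "value"])
writer.writeheader()
writer.writerows(dashboard_metrics)
csv_text = buffer.getvalue()
print(csv_text)
```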

Meeting notes to calendar events to follow-up tasks

Notion's meeting notes now extract action items automatically. Fazm can take those action items, create calendar events in the macOS Calendar app with the right attendees and times, and open relevant documents in their native apps for review, all from the accessibility tree of each application.
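The scheduling step can be sketched as turning a list of action items into back-to-back calendar blocks; the items, start time, and block length below are hypothetical stand-ins:

```python
from datetime import datetime, timedelta

# Hypothetical action items, as Notion's meeting notes might extract them.
action_items = [
    "Send revised budget to finance",
    "Review onboarding doc",
]

def schedule_blocks(items, start, block_minutes=30):
    """Turn a list of action items into consecutive calendar event stubs."""
    events = []
    cursor = start
    for item in items:
        end = cursor + timedelta(minutes=block_minutes)
        events.append({"title": item, "start": cursor, "end": end})
        cursor = end
    return events

events = schedule_blocks(action_items, datetime(2026, 3, 2, 9, 0))
for e in events:
    print(e["title"], e["start"].strftime("%H:%M"))
```

In the full workflow, each event stub would be entered into the macOS Calendar app through its accessibility tree.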

Task status in Notion triggers terminal commands

When a task in your Notion database moves to "Ready for deploy," Fazm can read that status change, open Terminal, run your deployment script, capture the output, and paste the deploy URL back into the Notion task as a comment. Notion Workers can run sandboxed JavaScript, but they cannot touch your local dev environment.
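The terminal leg of that workflow boils down to running a command and capturing its output. A sketch using a stand-in echo command (your real deploy script and URL would go in its place):

```python
import subprocess

# Stand-in for a real deployment script; `echo` keeps the sketch
# runnable anywhere. The URL is a hypothetical example.
result = subprocess.run(
    ["echo", "Deployed to https://staging.example.com/build-123"],
    capture_output=True,
    text=True,
    check=True,
)

deploy_output = result.stdout.strip()
# In the full workflow, this string would be pasted back into the
# Notion task as a comment via the accessibility tree.
print(deploy_output)
```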

Presentation mode content from multiple sources

Notion's presentation mode turns pages into slides. Fazm can help you build those slides by pulling data from other apps first: reading figures from a Numbers spreadsheet, grabbing a chart image from a browser tab, and inserting them into a Notion page before you enter presentation mode.

The pattern is always the same: read from one app's accessibility tree, act in another app's accessibility tree. No APIs to configure, no webhooks to set up, no OAuth tokens to manage. Fazm sees what is on screen the same way a person would, but reads the structured data underneath instead of the pixels on top.

5. Getting started

If you use Notion on a Mac and want to extend its 2026 features into cross-app workflows:

  1. Download Fazm from fazm.ai - it is free and open source.
  2. Grant accessibility permissions - macOS will prompt you to allow Fazm to read other apps' UI elements (System Settings > Privacy & Security > Accessibility).
  3. Open Notion - make sure the Notion desktop app (or browser tab) is running with the workspace you want to automate.
  4. Describe what you want in plain English - Fazm accepts natural language prompts. "Read my sprint tasks from Notion and create calendar blocks for each one" is a valid instruction.

Fazm routes each step to the right tool automatically: macOS accessibility APIs for native apps, Playwright for browser tabs, bash for terminal commands, and file operations for local documents. You do not need to specify which tool to use.
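Conceptually, that routing is a dispatch on the kind of target each step touches. A simplified sketch, with category and tool names that are illustrative rather than Fazm's actual code:

```python
# Illustrative routing table: each workflow step is sent to the tool
# matching its target type. Names here are hypothetical.
ROUTES = {
    "native_app": "macos_accessibility",
    "browser_tab": "playwright",
    "terminal": "bash",
    "local_file": "file_operations",
}

def route_step(step):
    """Pick the automation backend for a single workflow step."""
    try:
        return ROUTES[step["target"]]
    except KeyError:
        raise ValueError(f"no tool for target type: {step['target']!r}")

workflow = [
    {"target": "native_app", "action": "read Notion sprint tasks"},
    {"target": "terminal", "action": "run deploy script"},
    {"target": "native_app", "action": "create Calendar events"},
]

print([route_step(s) for s in workflow])
```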

The 2026 Notion features are genuinely useful on their own. Dashboards, AI agents, and Workers make Notion more powerful inside its workspace. But the most time-consuming workflows are the ones that cross application boundaries, and that is where desktop-level automation fills the gap that no Notion update can.

Frequently asked questions

Can Fazm automate Notion without using the Notion API?

Yes. Fazm uses macOS accessibility APIs to read and interact with the Notion desktop app directly. It sees the same buttons, menus, and text fields that a screen reader would see, identified by role and label rather than pixel coordinates. This means it works with any Notion feature, including ones the API does not expose, like presentation mode, dashboards, or the sidebar.

Does Fazm still work when Notion updates its interface?

Accessibility API automation is more resilient to UI changes than screenshot-based tools. When Notion moves a button or changes its color, the underlying accessibility label usually stays the same (e.g., a button labeled 'New page' keeps that label regardless of where it appears on screen). Fazm reads these labels, not pixel positions, so minor layout changes rarely break workflows.

What Notion 2026 features can Fazm automate that Notion's own automations cannot?

Notion's built-in automations operate within Notion. Fazm operates at the OS level, so it can build workflows that span multiple apps. For example: read a task from a Notion database, open the relevant file in Finder, launch a terminal command, paste the output back into Notion, and send a summary via Mail. None of that is possible with Notion's internal automation or API alone.

How is accessibility API automation different from screenshot-based computer use?

Screenshot-based tools capture an image of your screen and send it to a vision model to guess what is on screen and where to click. This is slow (each screenshot is hundreds of kilobytes) and error-prone (the model may misidentify buttons or text). Accessibility API automation reads the actual UI element tree from macOS, getting exact element labels, types, and positions as structured text. The model receives precise data, not pixels.

Does Fazm work with the Notion web app or only the desktop app?

Fazm works with both. For the desktop app, it uses macOS accessibility APIs via the mcp-server-macos-use binary. For the web app in a browser, it uses Playwright-based browser automation with structured accessibility snapshots (not screenshots). Both approaches read UI elements by semantic role and label rather than coordinates.

Is Fazm free to use with Notion?

Fazm is free and open source. You can download it from fazm.ai and start automating Notion workflows immediately. It runs locally on your Mac, and your data never leaves your machine during automation.

Automate Notion across your entire Mac

Fazm reads Notion's UI through accessibility APIs, not screenshots. Free, open source, and works with any app on your Mac.

Try Fazm Free