Notion Announcements April 2026: What Shipped, What the API Missed, and How to Automate It All

Matthew Diakonov
6 min read

Notion released six updates in the first ten days of April 2026: voice input on desktop, shareable AI chats, mute controls for discussion threads, new cover art, and more. Most roundups list these features and move on. This guide does something different: it shows that 6 of 8 new features have zero API support, explains why that matters for anyone building automations, and walks through how macOS accessibility APIs let you automate what the Notion API cannot reach.

4.9 from 500+ Mac users
Free & open source
Works offline
No API keys

1. Every April 2026 announcement at a glance

Below is every Notion announcement from April 1 through April 10, 2026, plus the late-March features that reached most users in early April. Dates come from Notion's official release notes.

Apr 10

Customize database tab display

Show tabs as text and icon, text only, or icon only. Per-user setting, does not change teammates' views.

Apr 9

Museum-sourced page cover art

Four new collections: natural textures from Texturelabs, Hudson River School paintings from the Met, Chinese silk paintings from the National Museum of Asian Art, and USDA Pomological Watercolors.

Apr 8

Mute discussion replies

Silence notifications for specific threads. Auto-unmutes if someone @-mentions you or you post a new comment.

Apr 7

Shareable AI chat links

Share Notion AI conversations as read-only links. Manage shared links under Settings > Public pages > Shared AI chats. Same day: the Create Comment API now accepts markdown bodies (SDK v5.17.0).

Apr 6

Voice input on desktop

Dictate prompts to Notion AI using your microphone on macOS and Windows. Custom AI Agents available free through May 3, 2026.

Apr 1

Desktop app v7.9.1 and v7.10.0

Bug fixes for tabs and meeting notes. v7.10.0 (Apr 6) added AI Meeting Notes in Command Search and an Electron 40.8.5 upgrade.

Late March (rolling out in April)

Mar 30

Tabs block (/tabs command)

Organize page content into tabbed sections. New block type that reduces vertical scrolling on long pages.

Mar 26

Notion 3.4: dashboards, presentation mode, new sidebar

Dashboards for databases (Business and Enterprise), redesigned sidebar with four customizable tabs, presentation mode for pages, H4 headers, and AI image generation.

2. The API gap: why 75% of new features are unreachable

Here is the part that every other Notion roundup skips. When you look at the April announcements through the lens of automation, a pattern emerges: Notion is shipping user-facing features much faster than it is adding API endpoints.

| Feature | API support | What that means |
| --- | --- | --- |
| Voice input | None | Cannot trigger, control, or read transcription results via API |
| Shareable AI chats | None | No endpoint to share, list, or revoke AI chat links |
| Markdown comment bodies | Yes | Create Comment endpoint, SDK v5.17.0+ |
| Mute discussions | None | No way to mute or unmute threads programmatically |
| Page cover art gallery | Partial | Can set a cover URL via API, but cannot browse the new gallery collections |
| Database tab display | None | Tab display preferences are UI-only, per-user settings |
| Tabs block (/tabs) | None | Block type not exposed in the API at all |
| Presentation mode | None | No API to start, navigate, or exit presentations |

The score: 1 feature with full API support, 1 with partial support, 6 with none. If you build Notion automations using only the API, three quarters of April's announcements are invisible to you.

3. How accessibility APIs bridge the gap

Notion on macOS is an Electron app. Like every Electron app, it exposes its entire UI through the macOS accessibility framework. Every button, text field, menu, tab, and dialog has a node in the accessibility tree with a role (what it is), a label (what it says), and a position (where it is on screen).

Fazm includes a component called mcp-server-macos-use that reads this tree and interacts with elements by identity. It does not take screenshots and guess where to click. It queries macOS for the element labeled "Voice input," gets its exact coordinates, and clicks it. When Notion redesigns a screen, the visual layout changes but accessibility labels typically stay the same, so automations keep working.

Here is what the accessibility tree looks like for Notion's new voice input feature:

[Group] "AI prompt"
  [Button] "Voice input" x:612 y:480 w:32 h:32
  [TextField] "Ask AI anything..." x:280 y:476 w:320 h:40
  [Button] "Submit" x:648 y:480 w:32 h:32

Each line is structured data: role in brackets, label in quotes, then exact screen coordinates and dimensions. Fazm reads this tree, finds the target element, and sends a click event to macOS at the element's center point. No vision model involved, no pixel guessing, no prompt engineering to interpret a screenshot.
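The find-and-click logic is easy to see in miniature. The Python sketch below is a simplified model, not Fazm's actual code: it parses the tree dump shown above into structured records, looks up an element by role and label, and returns the center point where a click would land.

```python
import re

# The example accessibility tree dump from above.
TREE = """\
[Group] "AI prompt"
  [Button] "Voice input" x:612 y:480 w:32 h:32
  [TextField] "Ask AI anything..." x:280 y:476 w:320 h:40
  [Button] "Submit" x:648 y:480 w:32 h:32
"""

# Role in brackets, label in quotes, then optional frame attributes.
LINE = re.compile(
    r'\[(?P<role>\w+)\] "(?P<label>[^"]*)"'
    r'(?: x:(?P<x>\d+) y:(?P<y>\d+) w:(?P<w>\d+) h:(?P<h>\d+))?'
)

def parse_tree(dump):
    """Turn each line of the dump into a dict of role, label, and frame."""
    return [m.groupdict() for line in dump.splitlines()
            if (m := LINE.search(line))]

def click_point(elements, role, label):
    """Return the center of the first element matching role and label."""
    for el in elements:
        if el["role"] == role and el["label"] == label and el["x"] is not None:
            x, y, w, h = (int(el[k]) for k in "xywh")
            return (x + w // 2, y + h // 2)
    return None

elements = parse_tree(TREE)
print(click_point(elements, "Button", "Voice input"))  # (628, 496)
```

Finding the element by identity first, then deriving coordinates, is the key property: if Notion moves the button, the frame attributes change but the lookup by role and label still resolves.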

This is what makes it possible to automate features that Notion has not added to its API. The accessibility tree is always current because macOS generates it from the live UI. If Notion adds a new button tomorrow, it appears in the tree immediately with its label and position.

How Fazm reads the accessibility tree

Under the hood, Fazm uses the macOS AXUIElement API. The app calls AXIsProcessTrusted() on launch to verify it has Accessibility permission, then queries the focused application's element hierarchy. Each element returns its role (AXRole), title (AXTitle), position (AXPosition), and size (AXSize) as structured attributes. This is the same interface that screen readers like VoiceOver use, which means it is well-maintained and stable across macOS versions.
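The hierarchy query is a depth-first walk: each element's AXChildren attribute yields nested elements, and each node reports its own attributes. The sketch below models that traversal with plain dicts standing in for AXUIElement references; real code would fetch AXRole, AXTitle, and AXChildren via AXUIElementCopyAttributeValue, and the mock prompt-bar structure is an assumption for illustration.

```python
def walk(element, depth=0, out=None):
    """Depth-first walk of a dict-based stand-in for the AX hierarchy.

    Real code would fetch AXRole / AXTitle / AXChildren with
    AXUIElementCopyAttributeValue; here they are plain dict keys.
    """
    if out is None:
        out = []
    out.append((depth, element.get("AXRole"), element.get("AXTitle")))
    for child in element.get("AXChildren", []):
        walk(child, depth + 1, out)
    return out

# A tiny mock of Notion's AI prompt bar as it might appear in the tree.
prompt_bar = {
    "AXRole": "AXGroup", "AXTitle": "AI prompt",
    "AXChildren": [
        {"AXRole": "AXButton", "AXTitle": "Voice input"},
        {"AXRole": "AXTextField", "AXTitle": "Ask AI anything..."},
    ],
}

for depth, role, title in walk(prompt_bar):
    print("  " * depth + f"{role}: {title}")
```

Flattening the tree this way is what produces the role/label/position listing shown earlier: one record per element, with depth preserving the containment structure.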

4. Workflows that use the April announcements

The real value of desktop-level automation is combining Notion's new features with the rest of your Mac. Here are workflows built around April 2026 announcements that are impossible with the Notion API alone.

Voice standup to Slack

Open Notion, click the voice input button via the accessibility tree, dictate your standup update, read the transcription, then paste it into your team's Slack channel. Fazm handles every step across both apps without manual switching.

Auto-share AI research

Run a research prompt in Notion AI, click the new share button to generate a read-only link, copy it, and post it to a GitHub issue, Linear ticket, or email thread. The entire chain runs through accessibility elements that have no API equivalent.

Batch-organize pages with tabs

For a collection of long Notion pages, open each one, insert a /tabs block, and move content sections into separate tabs. Doing this manually across dozens of pages takes hours. Because every element in the accessibility tree is addressable by label, the entire operation is scriptable.

Presentation slides to PDF

Enter presentation mode (Cmd+Opt+P), advance through each slide using the accessible navigation controls, capture a screenshot per slide, then combine them into a PDF using Preview. Notion offers no export option for presentation mode.

Each workflow pairs features that have no API with cross-app actions that no single tool handles alone. Fazm connects UI-only features to the rest of your desktop through structured accessibility automation, not fragile screenshot matching.
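Workflows like these reduce to an ordered list of identity-based UI steps. The sketch below is purely illustrative and not Fazm's format: the Step fields, the Slack channel name, and the field labels are all hypothetical, but it shows how the "voice standup to Slack" chain could be expressed declaratively and inspected before anything touches the screen.

```python
from dataclasses import dataclass

@dataclass
class Step:
    app: str        # app to bring frontmost
    role: str       # accessibility role to match
    label: str      # accessibility label to match
    action: str     # "click", "read", or "type"
    text: str = ""  # payload for "type" steps

# The "voice standup to Slack" workflow as a declarative step list.
# "{last_read}" is a placeholder for the transcription read in step 2.
VOICE_STANDUP = [
    Step("Notion", "Button", "Voice input", "click"),
    Step("Notion", "TextField", "Ask AI anything...", "read"),
    Step("Slack", "TextField", "Message #standup", "type", "{last_read}"),
]

def dry_run(steps):
    """Render the plan as text without touching the UI."""
    lines = []
    for s in steps:
        detail = f' "{s.text}"' if s.text else ""
        lines.append(f'{s.app}: {s.action} [{s.role}] "{s.label}"{detail}')
    return lines

print("\n".join(dry_run(VOICE_STANDUP)))
```

Keeping workflows as data rather than ad-hoc scripts makes each step auditable: every line names the app, the element identity, and the action, which is exactly the information the accessibility tree provides.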

Frequently asked questions

What are the major Notion announcements from April 2026?

Notion shipped six updates between April 1 and April 10, 2026: voice input for AI on desktop (Apr 6), shareable AI chat links (Apr 7), mute controls for discussion replies (Apr 8), museum-sourced page cover art collections (Apr 9), customizable database tab display (Apr 10), and desktop app v7.9.1 and v7.10.0 with bug fixes and AI Meeting Notes in Command Search.

How many of Notion's April 2026 features have API support?

Out of 8 features (including late-March rollouts), only 1 has full API support: markdown comment bodies via the Create Comment endpoint in SDK v5.17.0. One feature (page cover art) has partial API access for setting cover URLs, but not for browsing the new gallery. The remaining 6 features are UI-only with no API endpoints.

Can I automate Notion features that have no API?

Yes. Notion is an Electron app on macOS, so every UI element (buttons, tabs, text fields, menus) is exposed through the macOS accessibility framework. Tools like Fazm use the AXUIElement API to read these elements as structured data with exact roles, labels, and coordinates, then interact with them by identity rather than by pixel position.

What is the difference between accessibility API automation and screenshot-based automation?

Screenshot-based tools capture a full image and use a vision model to guess where to click. Accessibility API automation reads the actual element tree from macOS, getting structured data like [Button] 'Voice input' x:612 y:480. The accessibility approach is deterministic: it finds elements by label and role, so it keeps working when Notion updates its visual layout.

Does Fazm work with Notion's new voice input feature?

Yes. The voice input button appears in Notion AI's prompt bar as an accessible element. Fazm can locate it by its accessibility label, click it to start dictation, wait for transcription, and then submit the prompt. This enables hands-free workflows that chain voice input with actions in other Mac apps.

Can Fazm automate workflows across Notion and other Mac apps?

That is the primary use case. Because Fazm operates at the macOS level, it can read data from a Notion database, open a related file in Finder, run a terminal command, paste output back into Notion, and send a summary via Slack or Mail. Notion's internal automations and API only operate within Notion itself.

Automate Notion's newest features across your Mac

Fazm is free and open source. It uses macOS accessibility APIs to reach what the Notion API cannot. Download it and build workflows that span every app on your desktop.

Try Fazm free