Mobile App Development with AI Coding Agents: Native vs Cross-Platform
Introduction
As AI coding agents become a primary force in software development, selecting a framework for mobile apps also needs a new evaluation axis. This article compares native development (Swift/Kotlin), Flutter, and React Native from the perspective of how accurately and autonomously AI agents can write code. In particular, it examines how hybrid implementations that sit at the boundary between cross-platform and native development affect AI-assisted work.
This article is based on information available as of March 2026. Sources are cited inline.
1. Assumptions
This article assumes that AI coding agents such as Claude Code, GitHub Copilot, Cursor, and Windsurf are used as the main driver of development. The assumption is not simple code completion, but agent-style usage where you describe requirements and the tool autonomously creates files, edits code, and runs tests.
Under that assumption, an important evaluation axis becomes how accurately and autonomously an AI agent can write the code in question. In real app development, cross-platform code alone is often not enough and must be combined with native implementations. The core question here is how that hybrid structure affects AI agents.
2. What Changed by 2026?
React Native: Consolidation Around the New Architecture
In React Native v0.76 (October 2024), the New Architecture (JSI/Fabric/TurboModules) became the default. In v0.82 (October 2025), opting out to the legacy architecture was disabled. Then in v0.84 (February 2026), Hermes V1 became the default JavaScript engine, and on iOS, legacy-architecture code is now excluded from builds by default.
“Starting from React Native 0.76, the New Architecture is enabled by default in your projects.”
— React Native Blog — React Native 0.76 (October 2024)
“we're confident in making it the only architecture for this and future versions of React Native”
— React Native Blog — React Native 0.82: A New Era (October 2025)
Flutter: The Arrival of an Official Dart MCP Server
Starting with Dart SDK 3.9 (August 2025, reference: Announcing Dart 3.9), an official Dart Model Context Protocol (MCP) server was added to the stable channel. It can integrate directly with Claude Code, Cursor, GitHub Copilot, Gemini CLI, and others. As of 2026, one of Flutter's strengths is that AI agents can autonomously handle widget tree analysis, analyzer runs, tests, pub.dev searches, and dependency management. That said, the official documentation explicitly says it is “experimental and likely to evolve quickly”, so teams should expect API and behavior changes in real-world use.
Integration with Claude Code can be configured with this single command:
```shell
claude mcp add --transport stdio dart -- dart mcp-server
```
Reference: Dart and Flutter MCP server — Flutter official docs (updated March 25, 2026)
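For teams that prefer checked-in configuration over a one-off CLI command, Claude Code can also pick up MCP servers from a project-level .mcp.json file. A minimal sketch of such a file for the Dart MCP server is shown below; the exact field names follow Claude Code's MCP configuration format and should be verified against the current documentation:

```json
{
  "mcpServers": {
    "dart": {
      "type": "stdio",
      "command": "dart",
      "args": ["mcp-server"]
    }
  }
}
```

Committing this file makes the server available to everyone on the team without each developer running the add command.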
React Native: The Rise of Expo UI and Agent Skills
There are also two important developments on the React Native side. One is Expo UI (Expo SDK 55, February 2026). It points toward a model where native components from SwiftUI and Jetpack Compose can be used from React Native, making the line between cross-platform and native development less rigid than before.
However, Expo UI is still evolving. Expo itself indicates that Jetpack Compose support and SwiftUI support differ in maturity, so production use requires checking stability and API-change risk case by case. At this point, it is more accurate to treat Expo UI not as a fully matured solution, but as a promising mechanism that advances native UI reuse.
The other development is Callstack's agent-skills (January 2026). Callstack released a skill set that allows AI agents such as Claude Code, Cursor, and Codex to directly reference best practices for React Native optimization. It is not as deeply integrated as an MCP server, but it does make it easier for agents to access knowledge about performance tuning, list implementation, and bundle size reduction.
References: Expo SDK 55 Changelog (February 2026)
References: Callstack — Announcing React Native Best Practices for AI Agents (January 2026)
Design System Shift: The Shock of Material 3 Expressive
Material 3 Expressive (M3E), announced at Google I/O in May 2025 (reference: Google Blog — Material 3 Expressive), is a major redesign of Android UI, with spring-based motion, a new shape library, and more emphatic typography. It shipped to Pixel devices in Android 16 QPR1 (September 2025), and Google's major apps started adopting M3E in sequence.
The important point is that framework support differs significantly.
- Jetpack Compose: M3E components are available through the androidx.compose.material3 library. You can use new components such as LoadingIndicator, SplitButton, ButtonGroup, and FloatingToolbar, as well as Expressive-specific APIs such as MotionScheme.
- React Native (Expo UI): Because Jetpack Compose components can be used through Expo UI, there is room to follow M3E. However, Expo UI itself is still stabilizing, so it would be too strong to say React Native is already fully solid for M3E.
- Flutter: Material 3 Expressive is not supported as of March 2026. The Flutter team has stated, “Currently, we are not actively developing Material 3 Expressive,” and they are not accepting contributions either. The background is a structural reform to decouple Material and Cupertino libraries from the core framework. That separation started in Flutter 3.41 (February 2026), and M3E is expected to be provided later as a standalone package after decoupling, but the schedule is not fixed.
References: Flutter GitHub Issue #168813 — Bring Material 3 Expressive to Flutter
References: Android Developers — Material Design 3 in Compose
3. Factors That Determine AI-Agent Fit
Whether an AI coding agent can generate code accurately depends mainly on the following three factors.
- Amount of training data: React Native benefits from a massive JS/TS corpus. Flutter compensates with strong official documentation and MCP-based contextual understanding.
- Context consumption efficiency: Cross-platform development can stay within a single codebase, so the agent has less to keep in mind than with native development, which requires maintaining two codebases.
- Toolchain integration: Flutter's official Dart MCP server is currently the most systematic solution for automating the analyzer → fix → test loop. On the React Native side, Callstack has also published agent-skills for Claude Code, Cursor, and similar tools, so the gap is narrowing.
“AI code suggestion quality is roughly equivalent for both frameworks”
— Flutter vs React Native 7 Tests (tech-insider.org, March 2026)
That said, the comparison here is used only as supplementary context. Third-party articles such as tech-insider.org and vibrant-info.com are useful reading, but the conclusions in this article rely mainly on official documentation, vendor-authored technical information, and public implementation policies.
As supplementary material, you can also look at the startup-focused comparison React Native vs Flutter: Which Wins for Startups in 2026? (vibrant-info.com) and React Native Wrapped 2025 (Callstack), which summarizes ecosystem trends across 2025.
4. Native vs Cross-Platform Comparison
Comparing AI-Agent Fit
| Evaluation Axis | Native (Swift / Kotlin) | Flutter | React Native (New Arch) |
|---|---|---|---|
| Amount of AI training data | Rich, but split across OSs ⚠️ | Context understanding reinforced by MCP ⚠️ | Largest JS/TS corpus ✅ |
| Context consumption | Managing two codebases ❌ | One codebase ✅ | One codebase ✅ |
| MCP agent integration | Xcode integration is limited ❌ | Deep integration through the official Dart MCP server ✅ | Covered through Callstack agent-skills and similar efforts ⚠️ |
| Explicitness of code patterns | A lot of implicit knowledge ⚠️ | Clear with declarative UI ✅ | More predictable with the New Architecture ✅ |
| Complexity of native integration | Direct calls possible ✅ | Via Platform Channel / FFI ⚠️ | Via TurboModules + Codegen ⚠️ |
Comparing UI Implementation Characteristics
| Perspective | Swift (iOS) | Kotlin (Android) | Flutter | React Native |
|---|---|---|---|---|
| Platform alignment | Easy to stay close to HIG ✅ | Official implementation available for Material 3 Expressive ✅ | Requires extra work because of its own renderer ⚠️ | Easier to use SwiftUI/Jetpack Compose through Expo UI ⚠️ |
| Consistency across iOS and Android | iOS only | Android only | Identical across both OSs ✅ | Platform differences tend to surface ⚠️ |
| Freedom for custom design | Reject risk if you diverge from HIG ⚠️ | Rich within the M3 model ✅ | Fully flexible through its own engine ✅ | Can be extended with Skia ✅ |
| Material 3 Expressive support | — | M3E components available ✅ | Unsupported for now, expected after decoupling ❌ | Potential path via Expo UI ⚠️ |
| Animation quality | Core Animation with 120 Hz support ✅ | Compose Animation ✅ | Stable 60/120 fps with Impeller ✅ | GPU-driven with Reanimated 3 + Skia ✅ |
| Haptics and OS-integrated UI | Tight integration with Taptic Engine ✅ | Vibration API and similar features vary by device ⚠️ | Limited via plugins ⚠️ | Plugin-based, improved by the New Architecture ⚠️ |
| AI-generated UI accuracy | Strong for SwiftUI, more implicit knowledge in UIKit ⚠️ | Jetpack Compose still shows variance with newer APIs ⚠️ | Declarative widgets work well with AI ✅ | JSX benefits from transferable web knowledge ✅ |
5. The Reality of Hybrid Implementations and Their Impact on AI Agents
Even if you choose a cross-platform framework, real product development creates boundaries where native implementation is still required. Features such as camera access, Bluetooth, biometric authentication, push notifications, AR, and HealthKit often require direct access to OS APIs, which means calling native code through plugins, Platform Channels, or TurboModules.
This boundary is the hardest point for AI agents. When an implementation spans the Dart or TypeScript world and the Swift or Kotlin world, context gets distributed across multiple languages and multiple files, making generation errors more likely.
Flutter Hybrid Implementations: Platform Channel and FFI
Flutter has two main ways to call native code.
- Platform Channel: A mechanism for passing messages between Dart and Swift/Kotlin. Its structure is clear, but you must keep three files in sync: the Dart side, the iOS side, and the Android side. AI can easily update one side while leaving type definitions mismatched on the others.
- Dart FFI: A mechanism for calling C/C++ libraries directly. It lets you get closer to high-performance OS-level APIs, but it requires pointer and type-mapping knowledge, so AI generation accuracy drops further.
Reference: Flutter official docs — Developing packages & plugins (updated January 28, 2026)
React Native Hybrid Implementations: TurboModules + Codegen
From React Native v0.76 onward, TurboModules are used for native modules. Because Codegen can generate iOS (Swift/Obj-C) and Android (Kotlin) boilerplate from a TypeScript spec definition file, the TypeScript spec can more naturally serve as a Single Source of Truth.
```typescript
// NativeMyModule.ts — Example Spec file
import type { TurboModule } from 'react-native';
import { TurboModuleRegistry } from 'react-native';

export interface Spec extends TurboModule {
  getDeviceName(): string;
  requestCameraPermission(): Promise<boolean>;
}

export default TurboModuleRegistry.getEnforcing<Spec>('MyModule');
```
Codegen uses this spec to generate boilerplate for both iOS and Android. That makes it easier for AI agents to focus on the TypeScript layer while leaving repetitive native boilerplate to generation.
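To make the "single source of truth" idea concrete, the sketch below shows how one typed spec constrains both the JavaScript call site and the native side. The registry, module name, and mock implementation here are illustrative stand-ins, not real react-native APIs, so the example runs outside a React Native project; in a real app, the object behind the registry is backed by Swift/Kotlin code that Codegen generates against the same spec.

```typescript
// The spec: one TypeScript interface describes the module's surface.
interface Spec {
  getDeviceName(): string;
  requestCameraPermission(): Promise<boolean>;
}

// Mock "native" side: a stand-in for the Swift/Kotlin implementation
// that Codegen would scaffold from the spec above.
const registry = new Map<string, Spec>();
registry.set('MyModule', {
  getDeviceName: () => 'MockDevice',
  requestCameraPermission: async () => true,
});

// Stand-in for TurboModuleRegistry.getEnforcing: fails loudly if the
// native module was never registered, instead of returning undefined.
function getEnforcing(name: string): Spec {
  const mod = registry.get(name);
  if (!mod) throw new Error(`TurboModule "${name}" not registered`);
  return mod;
}

// JS call site: fully typed against the spec, so a drift between the
// spec and the implementation surfaces as a compile error, not a crash.
const MyModule = getEnforcing('MyModule');

async function main(): Promise<void> {
  console.log(MyModule.getDeviceName());
  console.log(await MyModule.requestCameraPermission());
}
main();
```

The point of the pattern is that an AI agent editing only the TypeScript layer is type-checked against the same contract the native layer implements, which is exactly the property that makes TurboModules friendlier to agents than hand-synced bridging code.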
References: React Native official docs — Native Modules: Introduction
References: Callstack — Announcing React Native Best Practices for AI Agents (January 2026)
Feature-by-Feature View: Hybrid Difficulty and AI Readiness
| Feature | Flutter (difficulty for AI) | React Native (difficulty for AI) |
|---|---|---|
| Camera / photos | Many mature plugins on pub.dev. Easy for AI to generate ✅ | react-native-vision-camera and similar libraries are mature and support the New Architecture ✅ |
| Push notifications | Standard options such as firebase_messaging ✅ | Expo Notifications is mature ✅ |
| Biometric auth (Face ID, etc.) | Supported with local_auth ✅ | Supported with react-native-biometrics and similar libraries ✅ |
| Bluetooth / BLE | Specifications are complex, and AI often mishandles permissions ⚠️ | Some modules still lack complete TurboModule support ⚠️ |
| HealthKit / Health Connect | AI tends to confuse iOS and Android differences ⚠️ | Large platform differences create room for AI misimplementation ⚠️ |
| Custom native modules (proprietary implementation) | Spans Dart + Swift + Kotlin. AI often makes consistency mistakes ❌ | Codegen from a TypeScript spec helps narrow and structure the work area ⚠️ |
| ARKit / ARCore | Complex native implementation remains. Autonomous AI implementation is difficult ❌ | Same issue. Without native knowledge, even review is hard ❌ |
Principles for Using AI Agents in Hybrid Implementations
The pattern that emerges from the cases above: let the agent own the cross-platform layer, where context stays in one language, and keep human review on every native boundary.
“Native plugins and heavy custom logic still need human oversight.”
— Stitch + Antigravity + Flutter (dev.to, March 2026)
6. What Should You Choose?
With the reality of hybrid implementation in mind, here are scenario-based recommendations.
Choose Flutter when native needs are limited to what mature plugins can handle
If your requirements are covered by standard plugins for camera access, notifications, authentication, and similar features, agent integration through Dart + the MCP server is highly efficient. However, if the Android side must comply with Material 3 Expressive, Flutter's current lack of support is a serious constraint.
Choose React Native when you need custom native modules or M3E compliance
TurboModules + Codegen, built around a TypeScript spec, make it easier to define the AI agent's work area clearly. That tends to be an advantage in hybrid implementations. Expo UI may also provide a path to Jetpack Compose M3E components, but any adoption decision should be preceded by a maturity check.
Choose native when native implementation makes up most of the product
If most of the work is native, as with ARKit, CoreML, or custom BLE control, the value of a cross-platform common layer becomes smaller. In that case, writing the app natively from the start lets the AI keep its context within one language.
Hybrid strategy: Flutter for the UI layer, modularized native code for heavy platform work
Another option is to build most of the app in Flutter while clearly isolating only the parts that require native processing into modules. By letting AI agents focus only on the Flutter layer, you can keep generation quality more stable.
7. Conclusion
The best choice depends less on “which framework you choose” and more on “what proportion and type of native implementation your product requires.”
If your native requirements stay within the range covered by mature plugins, Flutter is highly efficient, especially when combined with MCP server integration. If you need custom native modules, React Native's TypeScript-spec-centered structure is easier for AI to work with. As the share of native implementation increases, the benefits of cross-platform development shrink, and the answer moves closer to pure native development.
In every case, the precision of AI-assisted development depends heavily on designing the boundary between native and cross-platform code before handing the work to the agent.
Also, if your Android UI must comply with Material 3 Expressive, React Native + Expo UI (via Jetpack Compose) or native development is more realistic than Flutter, which remains unsupported as of March 2026. That said, Expo UI on the React Native side still requires ongoing maturity checks. The Flutter team is decoupling its design libraries, and M3E is expected to arrive later as a standalone package after that work completes, but the timeline remains uncertain.
References
- React Native 0.76 release blog — New Architecture enabled by default
- React Native 0.82 release blog — Opt-out from the legacy architecture disabled
- React Native 0.83 release blog — Zero breaking changes and introduction of the legacy-architecture removal flag
- React Native 0.84 release blog — Hermes V1 enabled by default and default exclusion of legacy architecture code
- Dart and Flutter MCP server — Flutter official documentation (updated March 25, 2026)
- Announcing Dart 3.9 — Stable-channel addition of the official Dart MCP server
- Flutter — Developing packages & plugins — Official explanation of Platform Channels and FFI (updated January 28, 2026)
- React Native — Native Modules: Introduction — Official TurboModules documentation
- Callstack — React Native Best Practices for AI Agents (January 2026)
- Flutter vs React Native 7 Tests, 1 Clear Winner (tech-insider.org, March 2026)
- React Native vs Flutter: Which Wins for Startups in 2026? (vibrant-info.com, March 2026)
- Expo SDK 55 Changelog — Overview of Expo UI and SDK 55 changes
- Expo Blog — Expo UI in SDK 55 — Positioning and constraints of Expo UI
- React Native Wrapped 2025 (Callstack, February 2026) — A year-in-review summary of the React Native ecosystem
- Google Blog — Material 3 Expressive (May 2025) — M3E announcement
- Flutter GitHub Issue #168813 — Bring Material 3 Expressive to Flutter — Flutter's direction on M3E support
- Android Developers — Material Design 3 in Compose — M3E implementation in Jetpack Compose