android-device-automation
Skill from web-infra-dev/midscene-skills
Vision-driven cross-platform UI automation skills powered by Midscene.js, supporting browser, desktop, Android, and iOS through screenshot-based control without DOM or accessibility labels.
Same repository
web-infra-dev/midscene-skills (8 items)
Installation
npx vibeindex add web-infra-dev/midscene-skills --skill android-device-automation

npx skills add web-infra-dev/midscene-skills --skill android-device-automation

Installed to: ~/.claude/skills/android-device-automation/SKILL.md
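Once installed, the skill drives an Android device over ADB using Midscene.js's vision-based agent. A minimal sketch of what that looks like in code, assuming the `@midscene/android` package and its documented `AndroidAgent`/`AndroidDevice` API (a connected device via `adb` and a configured vision-model API key are prerequisites; this is illustrative, not the skill's exact implementation):

```typescript
import { AndroidAgent, AndroidDevice, getConnectedDevices } from '@midscene/android';

async function main() {
  // Pick the first device reported by `adb devices`
  const devices = await getConnectedDevices();
  const device = new AndroidDevice(devices[0].udid);
  await device.connect();

  const agent = new AndroidAgent(device);

  // Natural-language instructions are interpreted from screenshots,
  // so no DOM or accessibility labels are required
  await agent.aiAction('open Settings and scroll to "About phone"');
  await agent.aiAssert('the device model name is visible');
}

main();
```

Because interaction is screenshot-driven, the same instructions work across apps that expose no accessibility tree, which is the core trade-off this skill is built around.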
More from this repository (7)
Vision-driven cross-platform UI automation skills built on Midscene.js. Supports browser, Chrome Bridge, desktop (macOS/Windows/Linux), Android (ADB), and iOS (WebDriverAgent) automation using natural-language commands and screenshot-based recognition.
A vision-driven cross-platform UI automation skill built on Midscene.js that uses natural-language commands and screenshots to control browsers, desktops, Android, iOS, and HarmonyOS devices, with support for E2E test scaffolding.
Vision-driven cross-platform UI automation skills built on Midscene.js. Supports natural-language driven control for browser, desktop (macOS/Windows/Linux), Android, iOS, and HarmonyOS platforms, operating entirely from screenshots for reliable cross-platform automation.