automation-script-generator
🎯 Skill from ntaksh42/agents
Generates automation scripts in multiple languages to streamline repetitive tasks like file management, data processing, API integration, and system monitoring.
Part of
ntaksh42/agents (78 items)
Installation
npx skills add ntaksh42/agents --skill automation-script-generator
Skill Details
Skill for generating automation scripts for repetitive tasks. Generates Shell, Python, PowerShell, and Node.js scripts, and automates file operations, data processing, API integration, CI/CD, backups, monitoring, and report generation.
Overview
# Automation Script Generator Skill
A skill that generates scripts to automate repetitive tasks.
Overview
This skill generates scripts that automate everyday repetitive work. It streamlines routine tasks such as file operations, data processing, API integration, backups, monitoring, and report generation. It supports multiple scripting languages and covers error handling, logging, and scheduled execution.
Key Features
- Multi-language support: Bash, Python, PowerShell, Node.js, Ruby
- File operations: batch rename, copy, move, compress, delete
- Data processing: CSV/JSON conversion, filtering, aggregation, merging
- API integration: REST API calls, authentication, error handling
- Backup automation: files, databases, cloud storage
- Monitoring & alerts: resource monitoring, health checks, notifications
- Report generation: log analysis, metrics aggregation, HTML/PDF reports
- CI/CD scripts: automated build, test, and deploy
- Scheduled execution: cron and Task Scheduler setup
- Error handling: robust error handling and retry logic
Script Types
1. File Operations
#### Batch Rename (Bash)
```bash
#!/bin/bash
# Rename image files in modification-date order
# Usage: ./rename_images.sh /path/to/images
set -euo pipefail
SOURCE_DIR="${1:-.}"
PREFIX="photo"
EXTENSION="jpg"
# Initialize the counter
counter=1
# Sort files by modification time (oldest first), NUL-delimited for safety;
# process substitution keeps the loop (and the counter) in the current shell
while IFS= read -r -d '' file; do
    # Build the new file name (zero-padded)
    new_name=$(printf "%s_%04d.%s" "$PREFIX" "$counter" "$EXTENSION")
    new_path="$SOURCE_DIR/$new_name"
    # Rename
    if [ "$file" != "$new_path" ]; then
        mv -v "$file" "$new_path"
        echo "Renamed: $(basename "$file") -> $new_name"
    fi
    ((counter++))
done < <(find "$SOURCE_DIR" -maxdepth 1 -type f -name "*.$EXTENSION" -printf '%T@\t%p\0' | sort -zn | cut -z -f2-)
echo "✅ Rename complete: $((counter - 1)) files"
```
#### File Organization (Python)
```python
#!/usr/bin/env python3
"""
Organize a downloads folder into subfolders by file extension
Usage: python organize_files.py ~/Downloads
"""
import os
import shutil
from pathlib import Path
from datetime import datetime
import logging
# Logging configuration
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s'
)
# Map extensions to destination folders
EXTENSION_MAPPING = {
'Images': ['.jpg', '.jpeg', '.png', '.gif', '.bmp', '.svg'],
'Documents': ['.pdf', '.doc', '.docx', '.txt', '.xlsx', '.pptx'],
'Videos': ['.mp4', '.avi', '.mov', '.mkv', '.flv'],
'Audio': ['.mp3', '.wav', '.flac', '.m4a'],
'Archives': ['.zip', '.rar', '.7z', '.tar', '.gz'],
'Code': ['.py', '.js', '.java', '.cpp', '.html', '.css'],
}
def organize_files(source_dir: str, dry_run: bool = False):
"""ãã¡ã€ã«ãæ¡åŒµåå¥ã«æŽç"""
source_path = Path(source_dir)
if not source_path.exists():
logging.error(f"ãã£ã¬ã¯ããªãååšããŸãã: {source_dir}")
return
files_moved = 0
for file_path in source_path.iterdir():
        # Skip directories
if file_path.is_dir():
continue
        # Get the extension
extension = file_path.suffix.lower()
# 察å¿ãããã©ã«ããç¹å®
target_folder = None
for folder, extensions in EXTENSION_MAPPING.items():
if extension in extensions:
target_folder = folder
break
        # Extensions with no mapping go to "Others"
if target_folder is None:
target_folder = "Others"
        # Create the destination directory
dest_dir = source_path / target_folder
if not dry_run:
dest_dir.mkdir(exist_ok=True)
        # Move the file
dest_path = dest_dir / file_path.name
        # If a file with the same name already exists, append a timestamp
if dest_path.exists():
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
stem = dest_path.stem
suffix = dest_path.suffix
dest_path = dest_dir / f"{stem}_{timestamp}{suffix}"
if dry_run:
logging.info(f"[DRY RUN] {file_path.name} -> {target_folder}/")
else:
shutil.move(str(file_path), str(dest_path))
logging.info(f"ç§»å: {file_path.name} -> {target_folder}/")
files_moved += 1
logging.info(f"â å®äº: {files_moved} ãã¡ã€ã«ãæŽçããŸãã")
if __name__ == "__main__":
import sys
if len(sys.argv) < 2:
print("äœ¿çšæ³: python organize_files.py <ãã£ã¬ã¯ããªãã¹> [--dry-run]")
sys.exit(1)
source_dir = sys.argv[1]
dry_run = "--dry-run" in sys.argv
organize_files(source_dir, dry_run)
```
2. Data Processing
#### CSV to JSON Conversion (Node.js)
```javascript
#!/usr/bin/env node
/**
 * Convert a CSV file to JSON
 * Usage: node csv_to_json.js input.csv output.json
 * Requires: npm install csv-parser
*/
const fs = require('fs');
const csv = require('csv-parser');
function csvToJson(inputFile, outputFile) {
const results = [];
fs.createReadStream(inputFile)
.pipe(csv())
.on('data', (data) => results.push(data))
.on('end', () => {
      // Write the JSON file
fs.writeFileSync(
outputFile,
JSON.stringify(results, null, 2),
'utf-8'
);
      console.log(`✅ Conversion complete: ${results.length} records`);
      console.log(`Output: ${outputFile}`);
})
.on('error', (error) => {
      console.error('Error:', error.message);
process.exit(1);
});
}
// Command-line arguments
const [inputFile, outputFile] = process.argv.slice(2);
if (!inputFile || !outputFile) {
  console.error('Usage: node csv_to_json.js <input.csv> <output.json>');
process.exit(1);
}
csvToJson(inputFile, outputFile);
```
#### Data Aggregation (Python)
```python
#!/usr/bin/env python3
"""
Aggregate statistics from a log file
Usage: python analyze_logs.py access.log
"""
import re
from collections import Counter, defaultdict
from datetime import datetime
import json
def analyze_access_log(log_file: str):
"""ã¢ã¯ã»ã¹ãã°ãåæ"""
# Apache/Nginx圢åŒã®ãã°ãã¿ãŒã³
log_pattern = re.compile(
r'(?P
r'"(?P
r'(?P
)
stats = {
'total_requests': 0,
'status_codes': Counter(),
'methods': Counter(),
'paths': Counter(),
'ips': Counter(),
'hourly_distribution': defaultdict(int),
}
with open(log_file, 'r') as f:
for line in f:
match = log_pattern.match(line)
if not match:
continue
data = match.groupdict()
stats['total_requests'] += 1
stats['status_codes'][data['status']] += 1
stats['methods'][data['method']] += 1
stats['paths'][data['path']] += 1
stats['ips'][data['ip']] += 1
            # Aggregate by hour of day
try:
dt = datetime.strptime(data['datetime'], '%d/%b/%Y:%H:%M:%S %z')
hour = dt.hour
stats['hourly_distribution'][hour] += 1
except ValueError:
pass
    # Generate the report
    print("=" * 60)
    print("Access Log Analysis Report")
    print("=" * 60)
    print(f"\nTotal requests: {stats['total_requests']:,}")
    print("\n--- By status code ---")
for status, count in stats['status_codes'].most_common():
print(f"{status}: {count:,}")
print("\n--- HTTPã¡ãœããå¥ ---")
for method, count in stats['methods'].most_common():
print(f"{method}: {count:,}")
print("\n--- ããã10 ãã¹ ---")
for path, count in stats['paths'].most_common(10):
print(f"{count:,} - {path}")
print("\n--- ããã10 IPã¢ãã¬ã¹ ---")
for ip, count in stats['ips'].most_common(10):
print(f"{count:,} - {ip}")
print("\n--- æé垯å¥ååž ---")
for hour in range(24):
count = stats['hourly_distribution'][hour]
        bar = '█' * (count // 100)
print(f"{hour:02d}:00 | {bar} {count:,}")
    # Save the stats as JSON
output_file = log_file.replace('.log', '_stats.json')
with open(output_file, 'w') as f:
        # Convert Counters to plain dicts
stats_dict = {
'total_requests': stats['total_requests'],
'status_codes': dict(stats['status_codes']),
'methods': dict(stats['methods']),
'top_paths': dict(stats['paths'].most_common(20)),
'top_ips': dict(stats['ips'].most_common(20)),
'hourly_distribution': dict(stats['hourly_distribution']),
}
json.dump(stats_dict, f, indent=2)
print(f"\nâ 詳现統èšãä¿å: {output_file}")
if __name__ == "__main__":
import sys
if len(sys.argv) < 2:
print("äœ¿çšæ³: python analyze_logs.py
sys.exit(1)
analyze_access_log(sys.argv[1])
```
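The feature list above also mentions HTML reports. As a minimal sketch (assuming the `*_stats.json` file written by the script above; the helper name `render_html_report` is illustrative, not part of the generated script), the aggregated stats could be rendered into a standalone HTML page:
```python
#!/usr/bin/env python3
"""Sketch: render the *_stats.json output of analyze_logs.py as a simple HTML report."""
import json
import sys
from html import escape

def render_html_report(stats_file: str, output_file: str = "report.html") -> None:
    """Read the aggregated stats JSON and write a small standalone HTML page."""
    with open(stats_file) as f:
        stats = json.load(f)

    def table(title: str, mapping: dict) -> str:
        # One two-column table per metric: key | count
        rows = "".join(
            f"<tr><td>{escape(str(key))}</td><td>{count:,}</td></tr>"
            for key, count in mapping.items()
        )
        return f"<h2>{escape(title)}</h2><table border='1'>{rows}</table>"

    html = (
        "<html><head><meta charset='utf-8'><title>Access Log Report</title></head><body>"
        "<h1>Access Log Report</h1>"
        f"<p>Total requests: {stats['total_requests']:,}</p>"
        + table("Status codes", stats["status_codes"])
        + table("Top paths", stats["top_paths"])
        + table("Top IP addresses", stats["top_ips"])
        + "</body></html>"
    )
    with open(output_file, "w", encoding="utf-8") as f:
        f.write(html)
    print(f"HTML report written to: {output_file}")

if __name__ == "__main__":
    render_html_report(sys.argv[1], sys.argv[2] if len(sys.argv) > 2 else "report.html")
```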
3. API Integration
#### REST API Automation (Python)
```python
#!/usr/bin/env python3
"""
Fetch repository information via the GitHub API
Usage: python github_stats.py <username>
Environment variable: GITHUB_TOKEN
"""
import os
import requests
from datetime import datetime
import json
GITHUB_API_BASE = "https://api.github.com"
def get_user_repos(username: str, token: str = None):
"""ãŠãŒã¶ãŒã®ãªããžããªäžèЧãååŸ"""
headers = {}
if token:
headers['Authorization'] = f'token {token}'
url = f"{GITHUB_API_BASE}/users/{username}/repos"
params = {'per_page': 100, 'sort': 'updated'}
try:
response = requests.get(url, headers=headers, params=params)
response.raise_for_status()
return response.json()
except requests.exceptions.RequestException as e:
print(f"ãšã©ãŒ: {e}")
return []
def generate_report(username: str, repos: list):
"""ã¬ããŒãçæ"""
if not repos:
print("ãªããžããªãèŠã€ãããŸããã§ãã")
return
    # Compute statistics
total_stars = sum(repo['stargazers_count'] for repo in repos)
total_forks = sum(repo['forks_count'] for repo in repos)
languages = {}
for repo in repos:
lang = repo.get('language')
if lang:
languages[lang] = languages.get(lang, 0) + 1
    # Print the report
    print("=" * 60)
    print(f"GitHub repository statistics: {username}")
    print("=" * 60)
    print(f"\nTotal repositories: {len(repos)}")
    print(f"Total stars: {total_stars:,}")
    print(f"Total forks: {total_forks:,}")
    print("\n--- Languages ---")
for lang, count in sorted(languages.items(), key=lambda x: x[1], reverse=True):
print(f"{lang}: {count}")
print("\n--- ããã10 ã¹ã¿ãŒæ° ---")
top_repos = sorted(repos, key=lambda x: x['stargazers_count'], reverse=True)[:10]
for repo in top_repos:
print(f"{repo['stargazers_count']:,} â - {repo['name']}")
print(f" {repo['html_url']}")
    # Save the report as JSON
output_file = f"{username}_github_stats.json"
with open(output_file, 'w') as f:
json.dump({
'username': username,
'total_repos': len(repos),
'total_stars': total_stars,
'total_forks': total_forks,
'languages': languages,
'top_repos': [
{
'name': repo['name'],
'stars': repo['stargazers_count'],
'forks': repo['forks_count'],
'language': repo.get('language'),
'url': repo['html_url'],
}
for repo in top_repos
],
'generated_at': datetime.now().isoformat(),
}, f, indent=2)
print(f"\nâ ã¬ããŒããä¿å: {output_file}")
if __name__ == "__main__":
import sys
if len(sys.argv) < 2:
print("äœ¿çšæ³: python github_stats.py
sys.exit(1)
username = sys.argv[1]
token = os.environ.get('GITHUB_TOKEN')
repos = get_user_repos(username, token)
generate_report(username, repos)
```
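Note that `get_user_repos` above returns at most the first page of results (100 repositories). If full coverage matters, a paginated variant along the following lines could be swapped in; this is a sketch (the function name `get_all_user_repos` is illustrative), using only the GitHub REST API's standard `page`/`per_page` parameters:
```python
#!/usr/bin/env python3
"""Sketch: paginated variant of get_user_repos for users with more than 100 repositories."""
import requests

GITHUB_API_BASE = "https://api.github.com"

def get_all_user_repos(username: str, token: str = None) -> list:
    """Fetch every page of a user's public repositories (GitHub caps per_page at 100)."""
    headers = {'Authorization': f'token {token}'} if token else {}
    repos, page = [], 1
    while True:
        response = requests.get(
            f"{GITHUB_API_BASE}/users/{username}/repos",
            headers=headers,
            params={'per_page': 100, 'page': page, 'sort': 'updated'},
            timeout=10,
        )
        response.raise_for_status()
        batch = response.json()
        if not batch:
            # An empty page means there is nothing left to fetch
            break
        repos.extend(batch)
        page += 1
    return repos
```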
4. Backup Automation
#### Database Backup (Bash)
```bash
#!/bin/bash
# Automated backup of a PostgreSQL database
# Usage: ./backup_postgres.sh
set -euo pipefail
# èšå®
DB_NAME="mydb"
DB_USER="postgres"
BACKUP_DIR="/var/backups/postgres"
RETENTION_DAYS=7
# Timestamp
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/${DB_NAME}_${TIMESTAMP}.sql.gz"
# Logging helper
log() {
echo "[$(date +'%Y-%m-%d %H:%M:%S')] $1"
}
# Create the backup directory
mkdir -p "$BACKUP_DIR"
# Run the backup
log "Backup started: $DB_NAME"
if pg_dump -U "$DB_USER" "$DB_NAME" | gzip > "$BACKUP_FILE"; then
    log "✅ Backup succeeded: $BACKUP_FILE"
    # Show the file size
    SIZE=$(du -h "$BACKUP_FILE" | cut -f1)
    log "File size: $SIZE"
else
    log "❌ Backup failed"
    exit 1
fi
# Delete old backups (past the retention period)
log "Deleting backups older than ${RETENTION_DAYS} days"
find "$BACKUP_DIR" -name "${DB_NAME}_*.sql.gz" -type f -mtime +$RETENTION_DAYS -delete
# çŸåšã®ããã¯ã¢ããäžèЧ
log "çŸåšã®ããã¯ã¢ãã:"
ls -lh "$BACKUP_DIR/${DB_NAME}_"*.sql.gz
log "ããã¯ã¢ããåŠçå®äº"
```
#### Cloud Storage Sync (Python)
```python
#!/usr/bin/env python3
"""
Back up local files to S3
Usage: python backup_to_s3.py /path/to/local s3://bucket-name/prefix
Requirements: pip install boto3
"""
import os
import boto3
from pathlib import Path
import logging
from datetime import datetime
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s'
)
def backup_to_s3(local_path: str, s3_uri: str):
"""ããŒã«ã«ãã£ã¬ã¯ããªãS3ã«ããã¯ã¢ãã"""
# S3ã¯ã©ã€ã¢ã³ã
s3 = boto3.client('s3')
# S3 URIè§£æ
if not s3_uri.startswith('s3://'):
raise ValueError("ç¡å¹ãªS3 URI")
parts = s3_uri[5:].split('/', 1)
bucket = parts[0]
prefix = parts[1] if len(parts) > 1 else ''
local_path = Path(local_path)
if not local_path.exists():
logging.error(f"ãã¹ãååšããŸãã: {local_path}")
return
files_uploaded = 0
total_size = 0
    # Upload files recursively
for file_path in local_path.rglob('*'):
if file_path.is_file():
            # Build the S3 key
relative_path = file_path.relative_to(local_path)
s3_key = f"{prefix}/{relative_path}".replace('\\', '/')
            # File size
file_size = file_path.stat().st_size
try:
                # Upload
s3.upload_file(
str(file_path),
bucket,
s3_key,
ExtraArgs={'StorageClass': 'STANDARD_IA'}
)
logging.info(f"ã¢ããããŒã: {relative_path} ({file_size:,} bytes)")
files_uploaded += 1
total_size += file_size
except Exception as e:
logging.error(f"ã¢ããããŒã倱æ {relative_path}: {e}")
logging.info(f"â å®äº: {files_uploaded} ãã¡ã€ã« ({total_size:,} bytes)")
if __name__ == "__main__":
import sys
if len(sys.argv) < 3:
print("äœ¿çšæ³: python backup_to_s3.py
sys.exit(1)
local_path = sys.argv[1]
s3_uri = sys.argv[2]
backup_to_s3(local_path, s3_uri)
```
5. Monitoring & Alerts
#### Server Monitoring (PowerShell)
```powershell
# Server resource monitoring script
# Usage: .\monitor_server.ps1
# Thresholds
$CPU_THRESHOLD = 80
$MEMORY_THRESHOLD = 85
$DISK_THRESHOLD = 90
# Log file
$LOG_FILE = "C:\Logs\server_monitor.log"
function Write-Log {
param($Message)
$timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
"$timestamp - $Message" | Out-File -FilePath $LOG_FILE -Append
Write-Host "$timestamp - $Message"
}
function Send-Alert {
param($Subject, $Body)
    # Implement email or Slack notification logic here (see the Python sketch after this block)
Write-Log "ALERT: $Subject - $Body"
}
# Check CPU usage
$cpu = (Get-Counter '\Processor(_Total)\% Processor Time').CounterSamples.CookedValue
if ($cpu -gt $CPU_THRESHOLD) {
    Send-Alert "High CPU usage" "CPU usage: $([math]::Round($cpu, 2))%"
}
# Check memory usage
$os = Get-CimInstance Win32_OperatingSystem
$totalMemory = $os.TotalVisibleMemorySize
$freeMemory = $os.FreePhysicalMemory
$usedMemoryPercent = (($totalMemory - $freeMemory) / $totalMemory) * 100
if ($usedMemoryPercent -gt $MEMORY_THRESHOLD) {
    Send-Alert "High memory usage" "Memory usage: $([math]::Round($usedMemoryPercent, 2))%"
}
# Check disk usage
Get-PSDrive -PSProvider FileSystem | Where-Object { $_.Used -gt 0 } | ForEach-Object {
$usedPercent = ($_.Used / ($_.Used + $_.Free)) * 100
if ($usedPercent -gt $DISK_THRESHOLD) {
        Send-Alert "High disk usage" "Drive $($_.Name): $([math]::Round($usedPercent, 2))%"
}
}
# Status log
Write-Log "Monitoring run complete - CPU: $([math]::Round($cpu, 2))% | Memory: $([math]::Round($usedMemoryPercent, 2))%"
```
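`Send-Alert` above is deliberately left as a stub. A minimal sketch of a webhook-based notifier (shown in Python; the `ALERT_WEBHOOK_URL` variable name and the Slack-style `{"text": ...}` payload are assumptions, adjust them to the notification service you actually use) could look like this:
```python
#!/usr/bin/env python3
"""Sketch: send a monitoring alert to an incoming-webhook endpoint (e.g. Slack)."""
import json
import os
import urllib.request

def send_alert(subject: str, body: str) -> None:
    """POST a JSON message to the webhook configured in ALERT_WEBHOOK_URL."""
    webhook_url = os.environ.get("ALERT_WEBHOOK_URL")  # assumed env var, not part of the script above
    if not webhook_url:
        # Fall back to logging when no webhook is configured
        print(f"ALERT (no webhook configured): {subject} - {body}")
        return
    payload = json.dumps({"text": f"*{subject}*\n{body}"}).encode("utf-8")
    request = urllib.request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        print(f"Alert sent (HTTP {response.status}): {subject}")

if __name__ == "__main__":
    send_alert("High CPU usage", "CPU usage: 93.5%")
```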
6. CI/CD Scripts
#### Build & Deploy (Bash)
```bash
#!/bin/bash
# Build and deploy a Node.js app
# Usage: ./deploy.sh production
set -euo pipefail
ENV="${1:-staging}"
PROJECT_NAME="myapp"
BUILD_DIR="dist"
DEPLOY_SERVER="user@production-server.com"
DEPLOY_PATH="/var/www/$PROJECT_NAME"
log() {
echo "[$(date +'%Y-%m-%d %H:%M:%S')] $1"
}
# Load environment variables
if [ -f ".env.$ENV" ]; then
    log "Loading environment variables from .env.$ENV"
export $(cat ".env.$ENV" | xargs)
fi
# Install dependencies
log "Installing dependencies..."
npm ci
# Run tests
log "Running tests..."
if ! npm test; then
    log "❌ Tests failed - aborting deploy"
    exit 1
fi
# Build
log "Building..."
npm run build
if [ ! -d "$BUILD_DIR" ]; then
log "â ãã«ã倱æ - $BUILD_DIR ãèŠã€ãããŸãã"
exit 1
fi
# Back up the current release on the deploy target
log "Creating a backup on the deploy target..."
ssh "$DEPLOY_SERVER" "cd $DEPLOY_PATH && tar -czf backup_$(date +%Y%m%d_%H%M%S).tar.gz * || true"
# Deploy
log "Deploying: $ENV"
rsync -avz --delete "$BUILD_DIR/" "$DEPLOY_SERVER:$DEPLOY_PATH/"
# Restart the service
log "Restarting the service..."
ssh "$DEPLOY_SERVER" "sudo systemctl restart $PROJECT_NAME"
# Health check
log "Running health check..."
sleep 5
HEALTH_URL="https://production-server.com/health"
STATUS=$(curl -s -o /dev/null -w "%{http_code}" "$HEALTH_URL")
if [ "$STATUS" == "200" ]; then
log "â ãããã€æå - ãã«ã¹ãã§ãã¯OK"
else
log "â ãã«ã¹ãã§ãã¯å€±æ (HTTP $STATUS)"
exit 1
fi
```
Scheduled Execution
cron configuration example
```bash
# Run the backup every day at 2:00 AM
0 2 * * * /path/to/backup_script.sh >> /var/log/backup.log 2>&1
# Analyze logs at the top of every hour
0 * * * * /usr/bin/python3 /path/to/analyze_logs.py
# Run the monitoring script every 5 minutes
*/5 * * * * /path/to/monitor_server.sh
# Clean up old files every Monday at 3:00 AM
0 3 * * 1 /path/to/cleanup_old_files.sh
```
Windows Task Scheduler (PowerShell)
```powershell
# Register the job with Task Scheduler
$action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
-Argument "-File C:\Scripts\backup.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
$principal = New-ScheduledTaskPrincipal -UserId "SYSTEM" `
-LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName "DailyBackup" `
-Action $action `
-Trigger $trigger `
-Principal $principal `
    -Description "Run the backup every day at 2:00 AM"
```
Error Handling
Retry Logic (Python)
```python
import time
from functools import wraps
def retry(max_attempts=3, delay=1, backoff=2):
"""ãªãã©ã€ãã³ã¬ãŒã¿"""
def decorator(func):
@wraps(func)
        def wrapper(*args, **kwargs):
attempts = 0
current_delay = delay
while attempts < max_attempts:
try:
                    return func(*args, **kwargs)
except Exception as e:
attempts += 1
if attempts >= max_attempts:
raise
print(f"ãšã©ãŒ: {e}")
print(f"ãªãã©ã€ {attempts}/{max_attempts} - {current_delay}ç§åŸã«å詊è¡...")
time.sleep(current_delay)
current_delay *= backoff
return wrapper
return decorator
@retry(max_attempts=3, delay=2, backoff=2)
def fetch_data_from_api(url):
import requests
response = requests.get(url, timeout=10)
response.raise_for_status()
return response.json()
```
䜿çšäŸ
åºæ¬çãªäœ¿ãæ¹
```
Generate a Bash script that renames image files in date order.
```
Specific task
```
以äžã®èŠä»¶ãæºããPythonã¹ã¯ãªãããçæããŠãã ããïŒ
ã¿ã¹ã¯: ããŠã³ããŒããã©ã«ããæ¡åŒµåå¥ã«æŽç
èŠä»¶:
- æ¡åŒµåããšã«ãã©ã«ããäœæïŒImages, Documents, VideosçïŒ
- ååãã¡ã€ã«ã¯ã¿ã€ã ã¹ã¿ã³ããä»äž
- ãã°åºå
- ãã©ã€ã©ã³æ©èœ
åºå: Python 3.8以äž
```
API integration script
```
Generate a script that uses the GitHub API to fetch a user's repository statistics:
Features:
- Fetch the repository list
- Aggregate star and fork counts
- Language statistics
- Output the report as JSON
- Support an authentication token
Language: Python
```
Backup automation
```
Generate a backup script for a PostgreSQL database:
Requirements:
- Compressed backups (gzip)
- Timestamped file names
- Automatically delete backups older than 7 days
- Log output
- Error handling
Language: Bash
Output: include a cron configuration example
```
Best Practices
- Error handling: wrap every external command and API call in error handling
- Logging: log execution status in detail
- Idempotency: safe to run multiple times
- Dry run: allow verification before performing the real operation
- Externalize configuration: prefer environment variables or config files over hard-coded values (see the sketch after this list)
- Backups: back up before destructive operations
- Notifications: report the success or failure of important jobs
- Documentation: describe usage in comments or a README
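As an illustration of the "externalize configuration" and "dry run" points above, a minimal Python sketch might look like this (the `BACKUP_DIR`/`RETENTION_DAYS` variable names and defaults are examples only):
```python
#!/usr/bin/env python3
"""Sketch: environment-driven configuration plus a --dry-run switch."""
import argparse
import os
import time
from pathlib import Path

# Settings come from environment variables with safe defaults instead of hard-coded values
BACKUP_DIR = Path(os.environ.get("BACKUP_DIR", "/var/backups/app"))
RETENTION_DAYS = int(os.environ.get("RETENTION_DAYS", "7"))

def cleanup(dry_run: bool) -> None:
    """Delete expired backup files, or only list them when --dry-run is given."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for path in BACKUP_DIR.glob("*.sql.gz"):
        if path.stat().st_mtime < cutoff:
            if dry_run:
                print(f"[DRY RUN] would delete {path}")
            else:
                path.unlink()
                print(f"Deleted {path}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--dry-run", action="store_true",
                        help="show what would be deleted without deleting anything")
    args = parser.parse_args()
    cleanup(args.dry_run)
```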
Version Information
- Skill version: 1.0.0
- Last updated: 2025-11-22
---
䜿çšäŸãŸãšã
ã·ã³ãã«ãªèªåå
```
Create a script that renames files.
```
詳现ãªèŠä»¶
```
以äžã®ã¿ã¹ã¯ãèªååããã¹ã¯ãªãããçæããŠãã ããïŒ
{詳现ãªèŠä»¶}
èšèª: Python/Bash/PowerShell
ãšã©ãŒãã³ããªã³ã°: å«ã
ãã°åºå: å«ã
```
Automate your daily repetitive tasks with this skill!