jpskill.com
💬 Communication Community 🟢 Non-engineers welcome 👤 Managers · HR · Customer support

💬 AI News Aggregator Sl

ai-news-aggregator-sl

Fetches the latest AI and tech news (or any custom topic) from RSS feeds, Tavily search, Twitter/X, and YouTube, writes an English editorial digest, and posts it to Discord.


📺 Watch a video first (YouTube)

▶ [Latest] Claude fully explained! 20+ handy features covered in a single video ↗

※ A video selected for reference by the jpskill.com editorial team. The video's content may not exactly match the Skill's behaviour.

📜 Original English description (for reference)

Fetches AI & tech news (default) or any custom topic (crypto, geopolitics, etc.) from RSS feeds, Tavily search, Twitter/X, and YouTube. Writes an English editorial digest using OpenAI by default (or DeepSeek / Claude), then posts it to Discord. Supports any time range (today, last 3 days, last week). Trigger when user asks for news, a digest, trending topics, or YouTube updates on any subject.

🇯🇵 Guide for Japanese creators

In short

Fetches the latest AI and tech news (or any custom topic) from RSS feeds, Tavily search, Twitter/X, and YouTube, writes an English editorial digest, and posts it to Discord.

※ Supplementary notes by the jpskill.com editorial team for Japanese business users. This is reference information, independent of the Skill's actual behaviour.

⚡ Recommended: one-line command install (60 seconds)

Copy the command below and paste it into a terminal (Mac/Linux) or PowerShell (Windows). Download → unzip → placement, fully automatic.

🍎 Mac / 🐧 Linux
mkdir -p ~/.claude/skills && cd ~/.claude/skills && curl -L -o ai-news-aggregator-sl.zip https://jpskill.com/download/4341.zip && unzip -o ai-news-aggregator-sl.zip && rm ai-news-aggregator-sl.zip
🪟 Windows (PowerShell)
$d = "$env:USERPROFILE\.claude\skills"; ni -Force -ItemType Directory $d | Out-Null; iwr https://jpskill.com/download/4341.zip -OutFile "$d\ai-news-aggregator-sl.zip"; Expand-Archive "$d\ai-news-aggregator-sl.zip" -DestinationPath $d -Force; ri "$d\ai-news-aggregator-sl.zip"

When it finishes, restart Claude Code → then just ask naturally, e.g. "Get today's AI news", and the Skill fires automatically.

💾 Manual download (if the command route is tricky)
  1. Click the blue button below to download ai-news-aggregator-sl.zip
  2. Double-click the ZIP file to extract it → an ai-news-aggregator-sl folder appears
  3. Move that folder to C:\Users\<your name>\.claude\skills\ (Windows) or ~/.claude/skills/ (Mac)
  4. Restart Claude Code

⚠️ Download and use at your own risk. This site accepts no responsibility for the content, behaviour, or safety of the Skill.

🎯 What this Skill can do

The description below explains what this Skill will do for you. It fires automatically when you ask Claude for something in this area.

📦 Installation (3 steps)

  1. Click the "Download" button above to get the .skill file
  2. Rename the extension from .skill to .zip and extract it (macOS can auto-extract)
  3. Place the resulting folder in .claude/skills/ under your home folder
    • macOS / Linux: ~/.claude/skills/
    • Windows: %USERPROFILE%\.claude\skills\

Restart Claude Code and you're done. You don't need to say "use this Skill" — it is invoked automatically for related requests.

See the detailed usage guide →
Last updated: 2026-05-17
Retrieved: 2026-05-17
Bundled files: 1

💬 Just ask like this — sample prompts

  • Use AI News Aggregator Sl to get today's AI news
  • Use AI News Aggregator Sl to collect crypto news from the last 3 days
  • Use AI News Aggregator Sl to show what's trending in AI on Twitter and YouTube

Paste one of these into Claude Code and the Skill fires automatically.

📖 The original SKILL.md that Claude reads (expanded)

This body is the original text (English or Chinese) that the AI (Claude) reads. Japanese translations are being added gradually.

🦞 AI News Aggregator

Collects news on any topic, writes an English editorial digest using your choice of AI provider, and posts it to Discord.

  • Default (AI topic): TechCrunch · The Verge · NYT Tech (RSS) + curated AI YouTube channels
  • Custom topics: Tavily news search + YouTube topic search (no Shorts, sorted by views)
  • AI providers: OpenAI (default) · DeepSeek · Anthropic Claude — switchable per request


Network Endpoints

| Endpoint | Purpose | Condition |
| --- | --- | --- |
| https://api.openai.com/v1/chat/completions | AI editorial summarisation | If using the OpenAI provider (default) |
| https://api.deepseek.com/chat/completions | AI editorial summarisation | If using the DeepSeek provider |
| https://api.anthropic.com/v1/messages | AI editorial summarisation | If using the Claude provider |
| https://discord.com/api/webhooks/... | Post digest to Discord | Always (required) |
| https://techcrunch.com/.../feed/ | RSS news (AI topic) | Default AI topic only |
| https://www.theverge.com/rss/... | RSS news (AI topic) | Default AI topic only |
| https://www.nytimes.com/svc/collections/... | RSS news (AI topic) | Default AI topic only |
| https://api.tavily.com/search | Custom topic news search | Only if TAVILY_API_KEY set |
| https://api.twitterapi.io/twitter/tweet/advanced_search | Twitter search | Only if TWITTERAPI_IO_KEY set |
| https://www.googleapis.com/youtube/v3/... | YouTube search | Only if YOUTUBE_API_KEY set |

When the DeepSeek provider is selected, the script does not contact OpenAI endpoints: the openai package is used solely as an OpenAI-compatible HTTP client pointed at https://api.deepseek.com, and OPENAI_API_KEY is not used in that mode.
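The provider switch works because these services expose OpenAI-compatible chat endpoints. A minimal sketch of the idea, with illustrative names and URLs (this is not the bundled script's actual code):

```python
import json

# Illustrative provider table: OpenAI-compatible base URL + default model.
PROVIDERS = {
    "openai": ("https://api.openai.com/v1", "gpt-4o-mini"),
    "deepseek": ("https://api.deepseek.com", "deepseek-chat"),
}

def build_chat_request(provider: str, prompt: str) -> tuple[str, str]:
    """Return (endpoint URL, JSON body) for an OpenAI-style chat call."""
    base, model = PROVIDERS[provider]
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return f"{base}/chat/completions", body

url, body = build_chat_request("deepseek", "Summarise today's AI news.")
print(url)
```

Only the base URL and API key change between providers; the request shape stays the same.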


Usage Examples

  • "Get today's AI news"
  • "Collect news about crypto"
  • "Last week's news about climate change"
  • "What's trending in AI today?"
  • "Get crypto news from the last 3 days using OpenAI"
  • "Show me recent Bitcoin YouTube videos"
  • "Summarise WWIII news with Claude"
  • "AI news using GPT-4o"
  • "AI news dry run" (preview without posting to Discord)
  • "Test my Discord webhook"

API Keys

| Key | Required | Where to get it |
| --- | --- | --- |
| DISCORD_WEBHOOK_URL | ✅ Always | Discord → Channel Settings → Integrations → Webhooks → Copy URL |
| OPENAI_API_KEY | If using OpenAI (default) | platform.openai.com/api-keys |
| DEEPSEEK_API_KEY | If using DeepSeek | platform.deepseek.com/api_keys |
| ANTHROPIC_API_KEY | If using Claude | console.anthropic.com → API Keys |
| TAVILY_API_KEY | For custom topics | app.tavily.com |
| TWITTERAPI_IO_KEY | Optional | twitterapi.io |
| YOUTUBE_API_KEY | Optional | console.cloud.google.com → YouTube Data API v3 |
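The optional rows above amount to a simple gate: a source is queried only when its key is present. A hypothetical sketch of that check (the function name is mine, not the script's):

```python
import os

# Optional sources and the env var that enables each (from the table above).
OPTIONAL_SOURCES = {
    "tavily": "TAVILY_API_KEY",
    "twitter": "TWITTERAPI_IO_KEY",
    "youtube": "YOUTUBE_API_KEY",
}

def active_optional_sources(env=os.environ) -> list[str]:
    """Names of optional sources whose API key is set (non-empty)."""
    return [name for name, key in OPTIONAL_SOURCES.items() if env.get(key)]

print(active_optional_sources({"YOUTUBE_API_KEY": "example-key"}))
```

Missing keys never abort a run; the corresponding source is simply skipped.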

AI Providers & Models

| Provider | --provider value | Default model | Best for |
| --- | --- | --- | --- |
| OpenAI | openai (default) | gpt-4o-mini | Quality, reliability |
| DeepSeek | deepseek | deepseek-chat | Cost-effective, fast |
| Claude | claude | claude-3-5-haiku-20241022 | Nuanced writing |

Override per request using the --provider flag. Set a permanent non-default with openclaw config set env.AI_PROVIDER '"deepseek"'. Override the model with --model (e.g. --model gpt-4o or --model claude-3-5-sonnet-20241022).


Implementation

IMPORTANT: Always run news_aggregator.py using the steps below. Do NOT search the web manually or improvise a response — the script handles all fetching, summarisation, and Discord posting.

Step 1 — Locate the script

The script is bundled with this skill. Find it:

SKILL_DIR=$(ls -d ~/.openclaw/skills/ai-news-aggregator-sl 2>/dev/null || ls -d ~/.openclaw/skills/news-aggregator 2>/dev/null)
SCRIPT="$SKILL_DIR/news_aggregator.py"
echo "Script: $SCRIPT"
ls "$SCRIPT"

Step 2 — Check uv is available

which uv && uv --version || echo "uv not found"

If uv is not found, ask the user to install it from their system package manager or from https://docs.astral.sh/uv/getting-started/installation/. Do not run a curl-pipe-sh command on the user's behalf.

Step 3 — API keys

Env vars are passed automatically by OpenClaw from its config. No .env file is needed.

Verify the required keys are set (without revealing values):

[[ -n "$OPENAI_API_KEY" ]]      && echo "OPENAI_API_KEY: set"      || echo "OPENAI_API_KEY: MISSING (required for default provider)"
[[ -n "$DISCORD_WEBHOOK_URL" ]] && echo "DISCORD_WEBHOOK_URL: set" || echo "DISCORD_WEBHOOK_URL: MISSING"

If any are missing, ask the user to register them:

openclaw config set env.OPENAI_API_KEY '<key>'
openclaw config set env.DISCORD_WEBHOOK_URL '<url>'
# Optional alternatives:
openclaw config set env.DEEPSEEK_API_KEY '<key>'
openclaw config set env.ANTHROPIC_API_KEY '<key>'

Step 4 — Parse the request

Extract topic, days, and provider from what the user said:

For AI provider:

| User said | --provider | --model |
| --- | --- | --- |
| "use OpenAI" / "with GPT" / "using ChatGPT" | --provider openai | (omit) |
| "use Claude" / "with Anthropic" | --provider claude | (omit) |
| "use DeepSeek" | --provider deepseek | (omit) |
| nothing specified | (omit — default) | (omit) |
| "use GPT-4o" / "with gpt-4o" | --provider openai | --model gpt-4o |
| "use claude sonnet" | --provider claude | --model claude-3-5-sonnet-20241022 |
| "use deepseek reasoner" | --provider deepseek | --model deepseek-reasoner |

Extract topic and days from what the user said:

| User said | --topic | --days |
| --- | --- | --- |
| "AI news" / "tech news" / nothing specific | (omit — default AI) | 1 |
| "crypto news" | --topic "crypto" | 1 |
| "news about climate change" | --topic "climate change" | 1 |
| "last week's crypto news" | --topic "crypto" | 7 |
| "last 3 days of Bitcoin news" | --topic "Bitcoin" | 3 |
| "yesterday's AI news" | (omit topic) | 1 |
| "this week in AI" | (omit topic) | 7 |

For report type:

| User said | Flag to add |
| --- | --- |
| "news" / "articles" / "digest" | --report news |
| "trending" / "Twitter" / "YouTube" | --report trending |
| "dry run" / "preview" / "don't post" | --dry-run |
| "test Discord" / "test webhook" | --test-discord |
| anything else | (omit — runs all) |
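The three tables in this step are essentially a keyword-to-flag mapping. A rough illustrative sketch follows; in practice Claude does this mapping itself, and the keyword patterns below are my assumptions, not the Skill's:

```python
import re

def parse_request(text: str) -> list[str]:
    """Map a casual user request onto news_aggregator.py CLI flags."""
    t = text.lower()
    flags: list[str] = []
    topic = re.search(r"news about ([\w ]+)", t)
    if topic:
        flags += ["--topic", topic.group(1).strip()]
    days = re.search(r"last (\d+) days", t)
    if days:
        flags += ["--days", days.group(1)]
    elif "last week" in t or "this week" in t:
        flags += ["--days", "7"]
    if "dry run" in t or "preview" in t:
        flags.append("--dry-run")
    for provider in ("openai", "deepseek", "claude"):
        if f"using {provider}" in t or f"with {provider}" in t:
            flags += ["--provider", provider]
    return flags

print(parse_request("last week's news about climate change, preview only"))
```

Omitted flags fall back to the script's defaults (AI topic, 1 day, default provider).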

Step 5 — Run with uv

uv run automatically installs all dependencies from the script's inline metadata — no venv setup needed.

uv run "$SCRIPT" [--topic "TOPIC"] [--days N] [--report TYPE] [--provider PROVIDER] [--model MODEL] [--dry-run]

Examples:

# AI news today — OpenAI (default)
uv run "$SCRIPT"

# Crypto news using OpenAI
uv run "$SCRIPT" --topic "crypto" --provider openai

# Last week's climate news using Claude
uv run "$SCRIPT" --topic "climate change" --days 7 --provider claude

# Use a specific model
uv run "$SCRIPT" --topic "Bitcoin" --provider openai --model gpt-4o

# Trending AI on Twitter and YouTube
uv run "$SCRIPT" --report trending

# Preview without posting to Discord
uv run "$SCRIPT" --topic "Bitcoin" --dry-run

# Test webhook connection
uv run "$SCRIPT" --test-discord

Step 6 — Report back

Tell the user what was posted to Discord, how many items were found per source, and note any skipped sources (e.g. "YouTube skipped — YOUTUBE_API_KEY not set").
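An illustrative formatter for that kind of report (the function and field names are mine, not the script's):

```python
def format_report(counts: dict[str, int], skipped: dict[str, str]) -> str:
    """One line per source: item count, or the reason it was skipped."""
    lines = [f"{source}: {n} item(s)" for source, n in counts.items()]
    lines += [f"{source}: skipped ({reason})" for source, reason in skipped.items()]
    return "\n".join(lines)

print(format_report(
    {"RSS": 12, "YouTube": 5},
    {"Twitter": "TWITTERAPI_IO_KEY not set"},
))
```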