jpskill.com
✍️ Writing Community 🟡 Takes a little getting used to 👤 Writers, marketers, PR

✍️ Research Summarizer

research-summarizer

A Skill for extracting key findings from research materials such as academic papers, web articles, and reports, and for producing comparative analyses and citations.

⏱ Research brief: half a day → 15 minutes
📜 Original English description (for reference)

Structured research summarization agent skill for non-dev users. Handles academic papers, web articles, reports, and documentation. Extracts key findings, generates comparative analyses, and produces properly formatted citations. Use when: user wants to summarize a research paper, compare multiple sources, extract citations from documents, or create structured research briefs. Plugin for Claude Code, Codex, Gemini CLI, and OpenClaw.

🇯🇵 Commentary for Japanese creators

In short

A Skill for extracting key findings from research materials such as academic papers, web articles, and reports, and for producing comparative analyses and citations.

※ This commentary was added by the jpskill.com editorial team for Japanese business settings. It is reference information, independent of the Skill's actual behavior.

⚡ Recommended: install with a single command (60 seconds)

Copy the command below and paste it into a terminal (Mac/Linux) or PowerShell (Windows). Download → extract → placement, all fully automated.

🍎 Mac / 🐧 Linux
mkdir -p ~/.claude/skills && cd ~/.claude/skills && curl -L -o research-summarizer.zip https://jpskill.com/download/5307.zip && unzip -o research-summarizer.zip && rm research-summarizer.zip
🪟 Windows (PowerShell)
$d = "$env:USERPROFILE\.claude\skills"; ni -Force -ItemType Directory $d | Out-Null; iwr https://jpskill.com/download/5307.zip -OutFile "$d\research-summarizer.zip"; Expand-Archive "$d\research-summarizer.zip" -DestinationPath $d -Force; ri "$d\research-summarizer.zip"

When it finishes, restart Claude Code → just ask normally, for example "Summarize this paper", and the Skill activates automatically.

💾 I want to download manually (for those who find the command difficult)
  1. Click the blue button below to download research-summarizer.zip
  2. Double-click the ZIP file to extract it → a research-summarizer folder appears
  3. Move that folder to C:\Users\<your name>\.claude\skills\ (Windows) or ~/.claude/skills/ (Mac)
  4. Restart Claude Code

⚠️ Download and use at your own risk. This site accepts no responsibility for the content, behavior, or safety of this Skill.

🎯 What this Skill can do

The description below explains what this Skill does for you. When you give Claude a request in this area, the Skill activates automatically.

📦 Installation (3 steps)

  1. Click the "Download" button above to get the .skill file
  2. Change the file extension from .skill to .zip and extract it (macOS can extract automatically)
  3. Place the extracted folder in .claude/skills/ under your home folder
    • macOS / Linux: ~/.claude/skills/
    • Windows: %USERPROFILE%\.claude\skills\

Restart Claude Code and you are done. You do not need to say "use this Skill": it is invoked automatically for relevant requests.

See the detailed usage guide →
Last updated: 2026-05-17
Retrieved: 2026-05-17
Included files: 5

💬 Just ask like this: sample prompts

  • Using Research Summarizer, summarize this paper into a structured brief
  • Using Research Summarizer, compare these three sources side by side
  • Using Research Summarizer, extract the citations from this PDF

Paste one of these into Claude Code and the Skill activates automatically.

📖 The original SKILL.md that Claude reads (expand to view)

This body text is the original (English or Chinese) that the AI (Claude) reads. Japanese translations are being added over time.

Research Summarizer

Read less. Understand more. Cite correctly.

Structured research summarization workflow that turns dense source material into actionable briefs. Built for product managers, analysts, founders, and anyone who reads more than they should have to.

Not a generic "summarize this" — a repeatable framework that extracts what matters, compares across sources, and formats citations properly.


Slash Commands

| Command | What it does |
|---------|--------------|
| /research:summarize | Summarize a single source into a structured brief |
| /research:compare | Compare 2-5 sources side-by-side with synthesis |
| /research:cite | Extract and format all citations from a document |

When This Skill Activates

Recognize these patterns from the user:

  • "Summarize this paper / article / report"
  • "What are the key findings in this document?"
  • "Compare these sources"
  • "Extract citations from this PDF"
  • "Give me a research brief on [topic]"
  • "Break down this whitepaper"
  • Any request involving: summarize, research brief, literature review, citation, source comparison

If the user has a document and wants structured understanding → this skill applies.


Workflow

/research:summarize — Single Source Summary

  1. Identify source type

    • Academic paper → use IMRAD structure (Introduction, Methods, Results, and Discussion)
    • Web article → use claim-evidence-implication structure
    • Technical report → use executive summary structure
    • Documentation → use reference summary structure
  2. Extract structured brief

    Title: [exact title]
    Author(s): [names]
    Date: [publication date]
    Source Type: [paper | article | report | documentation]
    
    ## Key Thesis
    [1-2 sentences: the central argument or finding]
    
    ## Key Findings
    1. [Finding with supporting evidence]
    2. [Finding with supporting evidence]
    3. [Finding with supporting evidence]
    
    ## Methodology
    [How they arrived at these findings — data sources, sample size, approach]
    
    ## Limitations
    - [What the source doesn't cover or gets wrong]
    
    ## Actionable Takeaways
    - [What to do with this information]
    
    ## Notable Quotes
    > "[Direct quote]" (p. X)
  3. Assess quality

    • Source credibility (peer-reviewed, reputable outlet, primary vs secondary)
    • Evidence strength (data-backed, anecdotal, theoretical)
    • Recency (when published, still relevant?)
    • Bias indicators (funding source, author affiliation, methodology gaps)
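
The source-type dispatch in step 1 can be sketched as a simple lookup. This is an illustrative sketch only; the function name and mapping are assumptions, not part of the skill's bundled scripts.

```python
# Illustrative sketch of step 1's source-type dispatch; not part of
# the skill's bundled scripts.

STRUCTURES = {
    "paper": "IMRAD",
    "article": "claim-evidence-implication",
    "report": "executive summary",
    "documentation": "reference summary",
}

def pick_structure(source_type: str) -> str:
    """Return the summary structure used for a given source type."""
    if source_type not in STRUCTURES:
        raise ValueError(f"unknown source type: {source_type!r}")
    return STRUCTURES[source_type]
```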

/research:compare — Multi-Source Comparison

  1. Collect sources (2-5 documents)

  2. Summarize each using the single-source workflow above

  3. Build comparison matrix

    | Dimension        | Source A        | Source B        | Source C        |
    |------------------|-----------------|-----------------|-----------------|
    | Central Thesis   | ...             | ...             | ...             |
    | Methodology      | ...             | ...             | ...             |
    | Key Finding      | ...             | ...             | ...             |
    | Sample/Scope     | ...             | ...             | ...             |
    | Credibility      | High/Med/Low    | High/Med/Low    | High/Med/Low    |
  4. Synthesize

    • Where do sources agree? (convergent findings = stronger signal)
    • Where do they disagree? (divergent findings = needs investigation)
    • What gaps exist across all sources?
    • What's the weight of evidence for each position?
  5. Produce synthesis brief

    ## Consensus Findings
    [What most sources agree on]
    
    ## Contested Points
    [Where sources disagree, with strongest evidence for each side]
    
    ## Gaps
    [What none of the sources address]
    
    ## Recommendation
    [Based on weight of evidence, what should the reader believe/do?]
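
The matrix in step 3 can be generated mechanically. Here is a minimal sketch; the function name and the "..." placeholder default are assumptions, not part of the bundled scripts.

```python
# Illustrative helper to render the step-3 comparison matrix as a
# Markdown table; not part of the skill's bundled scripts.

def comparison_matrix(sources: dict[str, dict[str, str]],
                      dimensions: list[str]) -> str:
    """Build a Markdown table: one row per dimension, one column per source."""
    names = list(sources)
    header = "| Dimension | " + " | ".join(names) + " |"
    rule = "|---" * (len(names) + 1) + "|"
    rows = [
        "| " + dim + " | "
        + " | ".join(sources[n].get(dim, "...") for n in names) + " |"
        for dim in dimensions
    ]
    return "\n".join([header, rule] + rows)
```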

/research:cite — Citation Extraction

  1. Scan document for all references, footnotes, in-text citations
  2. Extract and format using the requested style (APA 7 default)
  3. Classify citations by type:
    • Primary sources (original research, data)
    • Secondary sources (reviews, meta-analyses, commentary)
    • Tertiary sources (textbooks, encyclopedias)
  4. Output sorted bibliography with classification tags

Supported citation formats:

  • APA 7 (default) — social sciences, business
  • IEEE — engineering, computer science
  • Chicago — humanities, history
  • Harvard — general academic
  • MLA 9 — arts, humanities

Tooling

scripts/extract_citations.py

CLI utility for extracting and formatting citations from text.

Features:

  • Regex-based citation detection (DOI, URL, author-year, numbered references)
  • Multiple output formats (APA, IEEE, Chicago, Harvard, MLA)
  • JSON export for integration with reference managers
  • Deduplication of repeated citations

Usage:

# Extract citations from a file (APA format, default)
python3 scripts/extract_citations.py document.txt

# Specify format
python3 scripts/extract_citations.py document.txt --format ieee

# JSON output
python3 scripts/extract_citations.py document.txt --format apa --output json

# From stdin
cat paper.txt | python3 scripts/extract_citations.py --stdin
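
For intuition, the detection-plus-deduplication pass might look like the following simplified sketch. The regexes and function name are illustrative assumptions; the real extract_citations.py may use different patterns and options.

```python
import re

# Simplified, illustrative version of the detection pass; the real
# extract_citations.py may differ.

DOI_RE = re.compile(r"10\.\d{4,9}/[-._;/:a-zA-Z0-9]+")
AUTHOR_YEAR_RE = re.compile(r"\(([A-Z][a-z]+(?: et al\.)?),?\s+(\d{4})\)")

def detect_citations(text: str) -> list[str]:
    """Find DOIs and (Author, Year) citations, deduplicated in order of appearance."""
    hits = DOI_RE.findall(text)
    hits += [f"{author} ({year})" for author, year in AUTHOR_YEAR_RE.findall(text)]
    seen: set[str] = set()
    unique = []
    for h in hits:
        if h not in seen:
            seen.add(h)
            unique.append(h)
    return unique
```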

scripts/format_summary.py

CLI utility for generating structured research summaries.

Features:

  • Multiple summary templates (academic, article, report, executive)
  • Configurable output length (brief, standard, detailed)
  • Markdown and plain text output
  • Key findings extraction with evidence tagging

Usage:

# Generate structured summary template
python3 scripts/format_summary.py --template academic

# Brief executive summary format
python3 scripts/format_summary.py --template executive --length brief

# All templates listed
python3 scripts/format_summary.py --list-templates

# JSON output
python3 scripts/format_summary.py --template article --output json

Quality Assessment Framework

Rate every source on four dimensions:

| Dimension | High | Medium | Low |
|-----------|------|--------|-----|
| Credibility | Peer-reviewed, established author | Reputable outlet, known author | Blog, unknown author, no review |
| Evidence | Large sample, rigorous method | Moderate data, sound approach | Anecdotal, no data, opinion |
| Recency | Published within 2 years | 2-5 years old | 5+ years, may be outdated |
| Objectivity | No conflicts, balanced view | Minor affiliations disclosed | Funded by interested party, one-sided |

Overall Rating:

  • 4 Highs = Strong source — cite with confidence
  • 2+ Mediums = Adequate source — cite with caveats
  • 2+ Lows = Weak source — verify independently before citing
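
The rating rules above can be expressed in code. This is an illustrative encoding, not part of the bundled scripts; note that the rules overlap when a source has both 2+ Lows and 2+ Mediums, so this version checks Lows first.

```python
# Illustrative encoding of the overall-rating rules; not part of the
# skill's bundled scripts. Lows are checked before Mediums because
# both rules can apply to the same source.

def overall_rating(scores: dict[str, str]) -> str:
    """scores maps each dimension to 'High', 'Medium', or 'Low'."""
    values = list(scores.values())
    if values.count("High") == len(values):
        return "Strong source"
    if values.count("Low") >= 2:
        return "Weak source"
    return "Adequate source"
```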

Summary Templates

See references/summary-templates.md for:

  • Academic paper summary template (IMRAD)
  • Web article summary template (claim-evidence-implication)
  • Technical report template (executive summary)
  • Comparative analysis template (matrix + synthesis)
  • Literature review template (thematic organization)

See references/citation-formats.md for:

  • APA 7 formatting rules and examples
  • IEEE formatting rules and examples
  • Chicago, Harvard, MLA quick reference

Proactive Triggers

Flag these without being asked:

  • Source has no date → Note it. Undated sources lose credibility points.
  • Source contradicts other sources → Highlight the contradiction explicitly. Don't paper over disagreements.
  • Source is behind a paywall → Note limited access. Suggest alternatives if known.
  • User provides only one source for a compare → Ask for at least one more. Comparison needs 2+.
  • Citations are incomplete → Flag missing fields (year, author, title). Don't invent metadata.
  • Source is 5+ years old in a fast-moving field → Warn about potential obsolescence.

Installation

One-liner (any tool)

git clone https://github.com/alirezarezvani/claude-skills.git
cp -r claude-skills/product-team/research-summarizer ~/.claude/skills/

Multi-tool install

./scripts/convert.sh --skill research-summarizer --tool codex|gemini|cursor|windsurf|openclaw

OpenClaw

clawhub install cs-research-summarizer

Related Skills

  • product-analytics — Quantitative analysis. Complementary — use research-summarizer for qualitative sources, product-analytics for metrics.
  • competitive-teardown — Competitive research. Complementary — use research-summarizer for individual source analysis, competitive-teardown for market landscape.
  • content-production — Content writing. Research-summarizer feeds content-production — summarize sources first, then write.
  • product-discovery — Discovery frameworks. Complementary — research-summarizer for desk research, product-discovery for user research.

Included files

※ List of files contained in the ZIP. In addition to SKILL.md itself, it may include reference materials, samples, and scripts.