Featured paper
Algorithmic Authority: A Practitioner Framework for Generative Engine Optimization Based on a 7-Day Implementation Sprint
GEO Enterprise Framework · 10 layers · action research, March 17–23, 2026
SSRN eLibrary · Quantitative Marketing eJournal + Information Technology & Systems eJournal
The emergence of Large Language Models (LLMs) as primary information intermediaries has created an urgent need for frameworks that address how digital entities achieve visibility within AI-generated responses. This paper introduces the GEO Enterprise Framework, a 10-layer practitioner model for Generative Engine Optimization derived from action research conducted during a 7-day implementation sprint (March 17–23, 2026). The sprint produced 206 commits across 8 repositories, generated 61 pages and 92 indexable URLs, implemented 29 Schema.org types within a single JSON-LD @graph architecture, and achieved an Entity Consistency Score improvement from 20% to 80% — all at zero infrastructure cost using free-tier platforms. The framework formalizes several novel constructs: the Invisible Excellence Paradox, Algorithmic Citability, and the Entity Consistency Score (ECS).
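The paper introduces the Entity Consistency Score (ECS) as a novel construct, but its exact formula is not reproduced in this summary. One plausible reading, sketched below purely as an assumption, treats ECS as the share of (profile, field) pairs that match a canonical entity record across the surfaces where the entity appears. All entity data in the example is hypothetical.

```python
def entity_consistency_score(canonical, profiles):
    """Fraction of (profile, field) pairs that match the canonical record."""
    checks = 0
    matches = 0
    for profile in profiles:
        for field, value in canonical.items():
            checks += 1
            if profile.get(field) == value:
                matches += 1
    return matches / checks if checks else 0.0

# Hypothetical canonical entity record (illustrative data only).
canonical = {
    "name": "Acme Widgets",
    "url": "https://acme.example",
    "logo": "https://acme.example/logo.png",
    "foundingDate": "2010",
    "sameAs": "https://www.wikidata.org/wiki/Q0",
}

# Four observed surfaces (e.g. site, directory, social profile, knowledge panel).
profiles = [
    dict(canonical),                                              # fully consistent
    {**canonical, "foundingDate": "2011"},                        # one stale field
    {**canonical, "logo": None},                                  # one missing field
    {**canonical, "name": "ACME", "url": "http://acme.example"},  # two drifted fields
]

score = entity_consistency_score(canonical, profiles)
print(f"ECS = {score:.0%}")  # 16 of 20 checks match -> 80%
```

Under this reading, the reported improvement from 20% to 80% would correspond to aligning four out of every five tracked fields across surfaces.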
How to cite the featured paper
ABNT (NBR 6023)
CARAMASCHI, A. Algorithmic Authority: A Practitioner Framework for Generative Engine Optimization Based on a 7-Day Implementation Sprint. SSRN, 20 abr. 2026. DOI: 10.2139/ssrn.6460680. Disponível em: https://ssrn.com/abstract=6460680.
APA 7th
Caramaschi, A. (2026). Algorithmic Authority: A Practitioner Framework for Generative Engine Optimization Based on a 7-Day Implementation Sprint. SSRN. https://doi.org/10.2139/ssrn.6460680
BibTeX
@article{caramaschi2026algorithmic,
title = {Algorithmic Authority: A Practitioner Framework for Generative Engine Optimization Based on a 7-Day Implementation Sprint},
author = {Caramaschi, Alexandre},
year = {2026},
month = apr,
journal = {SSRN Electronic Journal},
doi = {10.2139/ssrn.6460680},
url = {https://ssrn.com/abstract=6460680},
note = {ORCID: 0009-0004-9150-485X}
}

Working papers
Three longitudinal studies derived from the GEO Multi-Vertical Citation Tracker pipeline. Data collected twice daily between April 8 and July 6, 2026.
How LLMs Cite Entities Across Industry Verticals: A 90-Day Empirical Study
Target: arXiv (cs.IR) · draft July 2026
Longitudinal empirical study tracking citation patterns of 69 Brazilian entities across 4 verticals (Fintech, Retail, Healthcare, Technology) by 4 LLMs (GPT-4o-mini, Claude Haiku 4.5, Gemini 2.5 Flash, Perplexity Sonar). Targets ~25,920 observations via 288 daily queries over 90 days. The methodology applies multiple-testing corrections and effect-size calculations across seven distinct hypothesis tests.
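The study names multiple-testing corrections without specifying a procedure in this summary. Assuming a Holm–Bonferroni step-down correction across the seven hypothesis tests, a minimal sketch (with made-up p-values) looks like this:

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Return a reject flag per hypothesis under Holm's step-down procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        # Step-down threshold alpha / (m - rank); stop at the first failure.
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break
    return reject

# Seven illustrative p-values, one per hypothesis test (invented numbers).
pvals = [0.001, 0.02, 0.004, 0.30, 0.011, 0.045, 0.0005]
flags = holm_bonferroni(pvals)
print(sum(flags), "of", len(pvals), "hypotheses rejected")
```

Holm's procedure controls the family-wise error rate while being uniformly more powerful than a plain Bonferroni cut at alpha/m, which is why it is a common default for small families of tests like this one.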
GEO vs SEO: Source Divergence Between Generative and Traditional Search
Target: SIGIR/WWW workshop · 2027
Comparative analysis of source overlap between ranked SERP results (traditional SEO) and LLM-cited sources for identical queries. Explores structural differences in authority signals: traditional ranking favors PageRank-style link graphs, whereas LLMs exhibit a preference for editorial authority and structured data. Quantifies divergence metrics suitable as KPIs for practitioners transitioning from SEO to GEO.
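The abstract does not fix a specific divergence metric. One candidate KPI, sketched here as an assumption rather than the paper's definition, is 1 minus the Jaccard overlap between the domains ranked in the SERP and the domains cited by the LLM for the same query (all domain names below are hypothetical):

```python
def source_divergence(serp_domains, llm_domains):
    """1 - Jaccard overlap of two cited-source sets (0 = identical, 1 = disjoint)."""
    union = serp_domains | llm_domains
    if not union:
        return 0.0
    return 1 - len(serp_domains & llm_domains) / len(union)

# Illustrative domain sets for a single query.
serp = {"news-a.example", "blog-b.example", "wiki-c.example", "gov-d.example"}
llm = {"wiki-c.example", "gov-d.example", "journal-e.example"}

div = source_divergence(serp, llm)
print(f"divergence = {div:.2f}")  # overlap 2/5 -> divergence 0.60
```

Averaging this score over a query set would give a single number a practitioner could track as SEO-to-GEO alignment shifts over time.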
Industry-Specific Patterns in AI Citation: A Multi-Vertical Analysis
Target: Information Sciences (Q1)
Sector-specific analysis of citation behavior: why LLMs cite fintechs more frequently than healthcare startups, the role of regulatory discourse, and how vertical editorial ecosystems shape algorithmic visibility. Includes a reproducible benchmark and open dataset derived from the 90-day longitudinal collection.
Academic profiles
0009-0004-9150-485X
Unique researcher identifier (Open Researcher and Contributor ID).
author=10853648
Official author page on the Social Science Research Network (Elsevier).
alexandrebrt14-sys/papers
Longitudinal collection pipeline and code for the working papers.
Q138755507
Structured entity for consumption by Knowledge Graphs and LLMs.
Frequently asked questions
About the paper, the methodology, and the publication program.