Compare Papers
Paper 1
Quantum-Enhanced Graph Analytics: A Hybrid AI Framework for Seller Fraud Detection in Online Marketplaces
Laura Thompson, Postdoctoral Fellow, Center for Quantum Neural Networks, Harvard University
- Year
- 2023
- Journal
- Stem Cell, Artificial Intelligence and Data Science Journal
- DOI
- 10.64206/99ssyx46
- arXiv
- -
Fraudulent seller networks in e-commerce platforms exploit relational patterns across buyers, products, and transactions to perpetrate large-scale scams that evade traditional detection systems. Graph Neural Networks (GNNs) provide end-to-end representation learning on graph structures, enabling detection of anomalous subgraphs indicative of fraud rings. Complementing GNNs, TinyML brings on-device inference for continuous, low-latency edge monitoring, and emerging Quantum Neural Networks (QNNs) promise enriched feature spaces for small-data regimes. This article delivers an expanded, scholarly framework covering: (1) formalization of fraudulent seller detection as a graph anomaly-ranking problem; (2) data pipelines and graph construction best practices; (3) detailed GNN architectures (GCN, GAT, GraphSAGE, graph autoencoders) and hybrid classifiers; (4) integration of TinyML for edge deployments; (5) incorporation of QNN modules for anomaly scoring; (6) comprehensive experimental evaluation on real and synthetic datasets; and (7) ethical, security, and regulatory considerations. We conclude with a multi-horizon research roadmap from near-term pilots to long-term fault-tolerant quantum defenses.
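The abstract's framing of seller fraud detection as a graph anomaly-ranking problem can be illustrated with a toy sketch. This is not the paper's method: the adjacency matrix, the shared-buyer edge construction, and the z-score-of-degree anomaly score are all illustrative stand-ins for the learned GNN embeddings the paper describes.

```python
import numpy as np

# Hypothetical sketch of the graph anomaly-ranking framing: sellers are
# nodes, shared buyers induce weighted edges, and each seller receives an
# anomaly score to be ranked. The score here is a simple one-hop feature
# deviation (z-score of weighted degree), standing in for a GNN embedding.

A = np.array([            # seller-seller adjacency (weight = shared buyers)
    [0, 3, 0, 0],
    [3, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

deg = A.sum(axis=1)                  # weighted degree per seller
z = (deg - deg.mean()) / deg.std()   # deviation from the population mean
ranking = np.argsort(-z)             # most anomalous sellers first
print(ranking)                       # sellers ordered by anomaly score
```

A real pipeline would replace the degree statistic with node representations learned by a GCN, GAT, or GraphSAGE encoder, but the output contract is the same: a ranked list of sellers by anomaly score.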
Paper 2
From Tensor Networks to Tractable Circuits, and back
Arend-Jan Quist, Marc Farreras Bartra, Alexis de Colnet, John van de Wetering, Alfons Laarman
- Year
- 2026
- Journal
- arXiv preprint
- DOI
- 10.48550/arXiv.2605.00106
- arXiv
- 2605.00106
Tensor networks and circuits are widely used data structures to represent pseudo-Boolean functions. These two formalisms have been studied primarily in separate communities, and this paper aims to establish equivalences between them. We show that some classes of tensor networks that are appealing in practice correspond to classes of circuits with specific properties that have been studied in knowledge compilation as tractable circuits. In particular, we prove that matrix product states (tensor trains) coincide with nondeterministic edge-valued decision diagrams and that tree tensor networks exactly correspond to structured-decomposable circuits. These correspondences enable direct transfer of structural and algorithmic results; for example, canonicity and tractability guarantees known for circuits yield analogous guarantees for the associated tensor networks, and vice versa.
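The matrix product state (tensor train) structure the abstract refers to can be sketched on a small example. The choice of function (3-bit parity) and the specific matrices are illustrative assumptions, not taken from the paper: each variable site carries one matrix per bit value, and evaluating the pseudo-Boolean function is a left-to-right chain of matrix products.

```python
import numpy as np

# Toy matrix product state (tensor train) for the pseudo-Boolean function
# f(x1, x2, x3) = x1 XOR x2 XOR x3. Each site holds two 2x2 matrices, one
# per bit value; evaluation contracts the chain left to right, which is
# the sequential structure the paper relates to edge-valued decision
# diagrams.

I = np.eye(2)            # bit = 0: leave the running parity state alone
X = np.array([[0., 1.],  # bit = 1: flip the running parity state
              [1., 0.]])

left = np.array([[1., 0.]])     # boundary row vector: start at even parity
right = np.array([[0.], [1.]])  # boundary column: read out odd parity

def parity_mps(bits):
    """Evaluate the tensor train by contracting the chain left to right."""
    v = left
    for b in bits:
        v = v @ (X if b else I)
    return float(v @ right)
```

For instance, `parity_mps([1, 0, 1])` evaluates to 0.0 (even parity) and `parity_mps([1, 1, 1])` to 1.0, mirroring how a decision diagram would trace one path per assignment.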