Commit 2705bb9

Added citation and paper URL (#357)
1 parent 2160f38 commit 2705bb9

File tree: 2 files changed, +85 -11 lines changed

CITATION.cff

Lines changed: 51 additions & 0 deletions

@@ -0,0 +1,51 @@
+cff-version: 1.2.0
+message: "If you use this software, please cite it as below."
+authors:
+  - family-names: "Skarlinski"
+    given-names: "Michael D."
+  - family-names: "Cox"
+    given-names: "Sam"
+  - family-names: "Laurent"
+    given-names: "Jon M."
+  - family-names: "Braza"
+    given-names: "James D."
+  - family-names: "Hinks"
+    given-names: "Michaela"
+  - family-names: "Hammerling"
+    given-names: "Michael J."
+  - family-names: "Ponnapati"
+    given-names: "Manvitha"
+  - family-names: "Rodriques"
+    given-names: "Samuel G."
+  - family-names: "White"
+    given-names: "Andrew D."
+title: "Language agents achieve superhuman synthesis of scientific knowledge"
+version: 2024
+doi: "10.xxxx/xxxxxx"
+date-released: 2024
+url: "https://paper.wikicrow.ai"
+preferred-citation:
+  type: article
+  authors:
+    - family-names: "Skarlinski"
+      given-names: "Michael D."
+    - family-names: "Cox"
+      given-names: "Sam"
+    - family-names: "Laurent"
+      given-names: "Jon M."
+    - family-names: "Braza"
+      given-names: "James D."
+    - family-names: "Hinks"
+      given-names: "Michaela"
+    - family-names: "Hammerling"
+      given-names: "Michael J."
+    - family-names: "Ponnapati"
+      given-names: "Manvitha"
+    - family-names: "Rodriques"
+      given-names: "Samuel G."
+    - family-names: "White"
+      given-names: "Andrew D."
+  title: "Language agents achieve superhuman synthesis of scientific knowledge"
+  journal: "preprint"
+  year: 2024
+  month: 9 # Adjust month if known
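The new CITATION.cff is plain YAML, so its metadata can be read programmatically. Below is a minimal stdlib-only sketch, not part of this commit (real tooling such as a YAML parser or the cffconvert utility is the better choice); it only handles the simple unindented `key: "value"` lines used in this file and skips the nested `authors` and `preferred-citation` entries:

```python
# Minimal, stdlib-only sketch: extract top-level `key: value` fields from
# CITATION.cff text. Nested entries (authors, preferred-citation) are skipped.
cff_text = '''cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "Language agents achieve superhuman synthesis of scientific knowledge"
date-released: 2024
url: "https://paper.wikicrow.ai"
'''

def top_level_fields(text: str) -> dict:
    """Parse unindented `key: value` lines; strip surrounding quotes."""
    fields = {}
    for line in text.splitlines():
        if line[:1].isspace() or ":" not in line:
            continue  # skip blank, indented (nested), or non-field lines
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip().strip('"')
    return fields

meta = top_level_fields(cff_text)
print(meta["url"])            # https://paper.wikicrow.ai
print(meta["date-released"])  # 2024
```

Note that `date-released` comes back as the string `"2024"`; a real YAML parser would give an integer, which is one reason to prefer it over this sketch.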

README.md

Lines changed: 34 additions & 11 deletions

@@ -4,7 +4,7 @@
 [![tests](https://github.com/whitead/paper-qa/actions/workflows/tests.yml/badge.svg)](https://github.com/whitead/paper-qa)
 [![PyPI version](https://badge.fury.io/py/paper-qa.svg)](https://badge.fury.io/py/paper-qa)
 
-PaperQA is a package for doing high-accuracy retrieval augmented generation (RAG) on PDFs or text files, with a focus on the scientific literature. See our 2023 [PaperQA paper](https://arxiv.org/abs/2312.07559) and our 2024 application paper[TODO] to see examples of PaperQA's superhuman performance in scientific tasks like question answering, summarization, and contradiction detection.
+PaperQA is a package for doing high-accuracy retrieval augmented generation (RAG) on PDFs or text files, with a focus on the scientific literature. See our [2024 application paper](https://paper.wikicrow.ai) for examples of PaperQA's superhuman performance in scientific tasks like question answering, summarization, and contradiction detection.
 
 ## Quickstart

@@ -18,17 +18,9 @@ pqa ask 'How can carbon nanotubes be manufactured at a large scale?'
 
 ### Example Output
 
-Question: How can carbon nanotubes be manufactured at a large scale?
+Question: Has anyone designed neural networks that compute with proteins or DNA?
 
-Carbon nanotubes can be manufactured at a large scale using the electric-arc technique (Journet6644). This technique involves creating an arc between two electrodes in a reactor under a helium atmosphere and using a mixture of a metallic catalyst and graphite powder in the anode. Yields of 80% of entangled carbon filaments can be achieved, which consist of smaller aligned SWNTs self-organized into bundle-like crystallites (Journet6644). Additionally, carbon nanotubes can be synthesized and self-assembled using various methods such as DNA-mediated self-assembly, nanoparticle-assisted alignment, chemical self-assembly, and electro-addressed functionalization (Tulevski2007). These methods have been used to fabricate large-area nanostructured arrays, high-density integration, and freestanding networks (Tulevski2007). 98% semiconducting CNT network solution can also be used and is separated from metallic nanotubes using a density gradient ultracentrifugation approach (Chen2014). The substrate is incubated in the solution and then rinsed with deionized water and dried with N2 air gun, leaving a uniform carbon network (Chen2014).
-
-**References:**
-
-Journet6644: Journet, Catherine, et al. "Large-scale production of single-walled carbon nanotubes by the electric-arc technique." nature 388.6644 (1997): 756-758.
-
-Tulevski2007: Tulevski, George S., et al. "Chemically assisted directed assembly of carbon nanotubes for the fabrication of large-scale device arrays." Journal of the American Chemical Society 129.39 (2007): 11964-11968.
-
-Chen2014: Chen, Haitian, et al. "Large-scale complementary macroelectronics using hybrid integration of carbon nanotubes and IGZO thin-film transistors." Nature communications 5.1 (2014): 4097.
+The claim that neural networks have been designed to compute with DNA is supported by multiple sources. The work by Qian, Winfree, and Bruck demonstrates the use of DNA strand displacement cascades to construct neural network components, such as artificial neurons and associative memories, using a DNA-based system (Qian2011Neural pages 1-2, Qian2011Neural pages 15-16, Qian2011Neural pages 54-56). This research includes the implementation of a 3-bit XOR gate and a four-neuron Hopfield associative memory, showcasing the potential of DNA for neural network computation. Additionally, the application of deep learning techniques to genomics, which involves computing with DNA sequences, is well-documented. Studies have applied convolutional neural networks (CNNs) to predict genomic features such as transcription factor binding and DNA accessibility (Eraslan2019Deep pages 4-5, Eraslan2019Deep pages 5-6). These models leverage DNA sequences as input data, effectively using neural networks to compute with DNA. While the provided excerpts do not explicitly mention protein-based neural network computation, they do highlight the use of neural networks in tasks related to protein sequences, such as predicting DNA-protein binding (Zeng2016Convolutional pages 1-2). However, the primary focus remains on DNA-based computation.
 
 ## What is PaperQA

@@ -583,3 +575,34 @@ with open("my_docs.pkl", "wb") as f:
 with open("my_docs.pkl", "rb") as f:
     docs = pickle.load(f)
 ```
+
+## Citation
+
+Please read and cite the following papers if you use this software:
+
+```bibtex
+@article{skarlinski2024language,
+    title={Language agents achieve superhuman synthesis of scientific knowledge},
+    author={
+        Michael D. Skarlinski and
+        Sam Cox and
+        Jon M. Laurent and
+        James D. Braza and
+        Michaela Hinks and
+        Michael J. Hammerling and
+        Manvitha Ponnapati and
+        Samuel G. Rodriques and
+        Andrew D. White},
+    year={2024},
+    journal={preprint},
+    url={https://paper.wikicrow.ai}
+}
+
+@article{lala2023paperqa,
+    title={PaperQA: Retrieval-Augmented Generative Agent for Scientific Research},
+    author={L{\'a}la, Jakub and O'Donoghue, Odhran and Shtedritski, Aleksandar and Cox, Sam and Rodriques, Samuel G and White, Andrew D},
+    journal={arXiv preprint arXiv:2312.07559},
+    year={2023}
+}
+```
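The BibTeX `author` field in the new entry is simply the CITATION.cff author list joined with ` and `. A small illustrative sketch (not part of the commit; the `AUTHORS` pairs are copied from the new CITATION.cff file) shows that mapping:

```python
# Illustrative sketch: build a BibTeX `author` field from the
# (family-names, given-names) pairs declared in CITATION.cff.
AUTHORS = [
    ("Skarlinski", "Michael D."),
    ("Cox", "Sam"),
    ("Laurent", "Jon M."),
    ("Braza", "James D."),
    ("Hinks", "Michaela"),
    ("Hammerling", "Michael J."),
    ("Ponnapati", "Manvitha"),
    ("Rodriques", "Samuel G."),
    ("White", "Andrew D."),
]

def bibtex_author_field(authors):
    """BibTeX separates individual names with ' and ': 'Given Family and ...'."""
    return " and ".join(f"{given} {family}" for family, given in authors)

# Prints the nine names joined by " and ", matching the README's BibTeX entry.
print(bibtex_author_field(AUTHORS))
```

Keeping the two files in sync this way means downstream tools (GitHub's "Cite this repository" button reads CITATION.cff; the README serves copy-paste BibTeX) present the same author list.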
