Update README.md
README.md
CHANGED
@@ -17,6 +17,11 @@ BigBanyanTree is an initiative to empower colleges to set up their data engineer
# Content

Each `arrow` file contains a table with fields extracted from Common Crawl WARC files.
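
A minimal sketch of how one of these files can be inspected locally, assuming it was written with the Hugging Face `datasets` library (the file name below is a placeholder, not an actual file from this repository):

```python
from datasets import Dataset

# Placeholder path: substitute an .arrow file downloaded from this dataset.
# If the file is a plain Arrow IPC file instead, pyarrow.ipc can be used to read it.
ds = Dataset.from_file("example.arrow")

print(ds.features)  # fields extracted from the Common Crawl WARC records
print(ds[0])        # first row of the table
```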

The datasets provided are derived from processing 900 randomly sampled WARC files from the [2018-51 CommonCrawl dump](https://data.commoncrawl.org/crawl-data/CC-MAIN-2018-51/index.html).

The MaxMind database used to enrich WARC data with geolocation information is GeoLite2-City_20240903 (released on 3rd Sept. 2024).
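
For context, an illustrative lookup against that GeoLite2-City database with the `geoip2` package looks roughly like this (the database path and IP address are placeholders, and this is not the exact enrichment code used to build the dataset):

```python
import geoip2.database

# Placeholder path and IP: substitute a local GeoLite2-City .mmdb file and a real address.
with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
    record = reader.city("93.184.216.34")
    print(record.country.iso_code, record.city.name,
          record.location.latitude, record.location.longitude)
```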

## <span style="color:red">⚠️ WARNING ⚠️</span>

The **URLs** and **IP addresses** extracted in this dataset are sourced from **publicly available Common Crawl data dumps**. Please be aware that: