{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Amazon Redshift Feature Support"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Overview\n",
"`redshift_connector` aims to support the latest and greatest features provided by Amazon Redshift so you can get the most out of your data."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## COPY and UNLOAD Support - Amazon S3\n",
"`redshift_connector` provides the ability to `COPY` and `UNLOAD` data from an Amazon S3 bucket. Shown below is a sample workflow which copies and unloads data from an Amazon S3 bucket"
]
},
{
"cell_type": "markdown",
"metadata": {
"pycharm": {
"name": "#%% md\n"
}
},
"source": [
"1. Upload the following text file to an Amazon S3 bucket and name it `category_csv.txt`\n",
"\n",
"```text\n",
" 12,Shows,Musicals,Musical theatre\n",
" 13,Shows,Plays,\"All \"\"non-musical\"\" theatre\"\n",
" 14,Shows,Opera,\"All opera, light, and \"\"rock\"\" opera\"\n",
" 15,Concerts,Classical,\"All symphony, concerto, and choir concerts\"\n",
"```"
]
},
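{
"cell_type": "markdown",
"metadata": {},
"source": [
"The file can be uploaded with any Amazon S3 client. Below is a minimal sketch using `boto3`; the bucket name `testing` and the object key `category_csv.txt` are assumptions that mirror the `COPY` example further down, so adjust them to match your own bucket."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import boto3\n",
"\n",
"# Minimal sketch: upload the sample file to Amazon S3.\n",
"# The bucket name 'testing' mirrors the bucket used by the COPY example below;\n",
"# replace the bucket and the local path with your own values.\n",
"s3 = boto3.client('s3')\n",
"s3.upload_file(\n",
"    Filename='category_csv.txt',  # local file containing the rows shown above\n",
"    Bucket='testing',             # target S3 bucket (assumed)\n",
"    Key='category_csv.txt'        # object key referenced by the COPY command\n",
")\n"
]
},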
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"source": [
"import redshift_connector\n",
"\n",
"with redshift_connector.connect(\n",
" host='examplecluster.abc123xyz789.us-west-1.redshift.amazonaws.com',\n",
" database='dev',\n",
" user='awsuser',\n",
" password='my_password'\n",
") as conn:\n",
" with conn.cursor() as cursor:\n",
" cursor.execute(\"create table category (catid int, cargroup varchar, catname varchar, catdesc varchar)\")\n",
" cursor.execute(\"copy category from 's3://testing/category_csv.txt' iam_role 'arn:aws:iam::123:role/RedshiftCopyUnload' csv;\")\n",
" cursor.execute(\"select * from category\")\n",
" print(cursor.fetchall())\n",
" cursor.execute(\"unload ('select * from category') to 's3://testing/unloaded_category_csv.txt' iam_role 'arn:aws:iam::123:role/RedshiftCopyUnload' csv;\")\n",
" print('done')\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"After executing the above code block, we can see the requested data was unloaded into the following file, `unloaded_category_csv.text0000_part00`, in the specified Amazon s3 bucket\n"
]
}
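{
"cell_type": "markdown",
"metadata": {},
"source": [
"The unloaded file can be inspected with any Amazon S3 client. Below is a minimal sketch using `boto3` that lists the objects written by the `UNLOAD` command; the bucket name `testing` and the key prefix `unloaded_category_csv.txt` are assumptions carried over from the example above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import boto3\n",
"\n",
"# Minimal sketch: list the objects produced by the UNLOAD command.\n",
"# The bucket name 'testing' and the prefix 'unloaded_category_csv.txt'\n",
"# are taken from the example above; replace them with your own values.\n",
"s3 = boto3.client('s3')\n",
"response = s3.list_objects_v2(Bucket='testing', Prefix='unloaded_category_csv.txt')\n",
"for obj in response.get('Contents', []):\n",
"    print(obj['Key'], obj['Size'])\n"
]
}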
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.7"
}
},
"nbformat": 4,
"nbformat_minor": 1
}