@@ -258,7 +258,7 @@ after a delimiter:
    data = ' a, b, c\n 1, 2, 3\n 4, 5, 6'
    print(data)
    pd.read_csv(StringIO(data), skipinitialspace=True)
-
+
 Moreover, ``read_csv`` ignores any completely commented lines:

 .. ipython:: python
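The hunk cuts off at the ``ipython`` directive, so the example itself is not shown here. A minimal sketch of the behavior the sentence describes, assuming the doc's example uses ``'#'`` as the comment character:

.. code-block:: python

   from io import StringIO
   import pandas as pd

   # Rows that consist entirely of a comment are skipped by read_csv
   data = 'a,b,c\n# this line is ignored\n1,2,3\n4,5,6'
   pd.read_csv(StringIO(data), comment='#')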
@@ -2962,7 +2962,7 @@ Notes & Caveats
   ``tables``. The sizes of a string based indexing column
   (e.g. *columns* or *minor_axis*) are determined as the maximum size
   of the elements in that axis or by passing the parameter
-- Be aware that timezones (e.g., ``pytz.timezone('US/Eastern')``)
+- Be aware that timezones (e.g., ``pytz.timezone('US/Eastern')``)
   are not necessarily equal across timezone versions. So if data is
   localized to a specific timezone in the HDFStore using one version
   of a timezone library and that data is updated with another version, the data
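A hedged illustration of the timezone caveat above (requires PyTables; the file name and data are made up):

.. code-block:: python

   import pandas as pd

   # Localize an index to US/Eastern and round-trip it through an HDFStore.
   # If the timezone library version differs between the writer and a later
   # reader, the stored timezone may not compare equal to a freshly
   # constructed one, which is the caveat described above.
   idx = pd.date_range('2014-01-01', periods=3, tz='US/Eastern')
   df = pd.DataFrame({'value': [1, 2, 3]}, index=idx)

   with pd.HDFStore('tz_example.h5') as store:
       store.put('df', df, format='table')
       roundtripped = store['df']

   print(roundtripped.index.tz)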
@@ -3409,14 +3409,14 @@ Google BigQuery (Experimental)
 The :mod:`pandas.io.gbq` module provides a wrapper for Google's BigQuery
 analytics web service to simplify retrieving results from BigQuery tables
 using SQL-like queries. Result sets are parsed into a pandas
-DataFrame with a shape and data types derived from the source table.
-Additionally, DataFrames can be appended to existing BigQuery tables if
+DataFrame with a shape and data types derived from the source table.
+Additionally, DataFrames can be appended to existing BigQuery tables if
 the destination table is the same shape as the DataFrame.

 For specifics on the service itself, see `here <https://developers.google.com/bigquery/>`__

-As an example, suppose you want to load all data from an existing BigQuery
-table: `test_dataset.test_table` into a DataFrame using the :func:`~pandas.io.read_gbq`
+As an example, suppose you want to load all data from an existing BigQuery
+table: `test_dataset.test_table` into a DataFrame using the :func:`~pandas.io.read_gbq`
 function.

 .. code-block:: python
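The body of the ``code-block`` falls outside this hunk. As a hedged sketch, a call matching the surrounding prose might look like this, with a placeholder project id:

.. code-block:: python

   import pandas as pd

   projectid = 'my-project-id'  # placeholder, not from the original docs

   # Load the whole table into a DataFrame; shape and dtypes are
   # derived from the source table as described above.
   df = pd.read_gbq('SELECT * FROM test_dataset.test_table',
                    project_id=projectid)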
@@ -3447,14 +3447,14 @@ Finally, you can append data to a BigQuery table from a pandas DataFrame
 using the :func:`~pandas.io.to_gbq` function. This function uses the
 Google streaming API which requires that your destination table exists in
 BigQuery. Given the BigQuery table already exists, your DataFrame should
-match the destination table in column order, structure, and data types.
-DataFrame indexes are not supported. By default, rows are streamed to
-BigQuery in chunks of 10,000 rows, but you can pass other chunk values
-via the ``chunksize`` argument. You can also see the progress of your
-post via the ``verbose`` flag, which defaults to ``True``. The HTTP
-response code of Google BigQuery can be successful (200) even if the
-append failed. For this reason, if there is a failure to append to the
-table, the complete error response from BigQuery is returned, which
+match the destination table in column order, structure, and data types.
+DataFrame indexes are not supported. By default, rows are streamed to
+BigQuery in chunks of 10,000 rows, but you can pass other chunk values
+via the ``chunksize`` argument. You can also see the progress of your
+post via the ``verbose`` flag, which defaults to ``True``. The HTTP
+response code of Google BigQuery can be successful (200) even if the
+append failed. For this reason, if there is a failure to append to the
+table, the complete error response from BigQuery is returned, which
 can be quite long given it provides a status for each row. You may want
 to start with smaller chunks to test that the size and types of your
 DataFrame match your destination table to make debugging simpler.
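A sketch of an append built from the parameters the paragraph names (``chunksize``, ``verbose``); the dataset, table, and project id are placeholders, and newer pandas-gbq versions have since dropped ``verbose``:

.. code-block:: python

   import pandas as pd

   projectid = 'my-project-id'  # placeholder
   df = pd.DataFrame({'my_string': list('abc'), 'my_int': [1, 2, 3]})

   # Stream to an existing table in chunks of 5,000 rows instead of the
   # default 10,000, printing progress after each chunk is posted.
   df.to_gbq('test_dataset.test_table', project_id=projectid,
             chunksize=5000, verbose=True)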
@@ -3470,9 +3470,9 @@ The BigQuery SQL query language has some oddities, see `here <https://developers

 While BigQuery uses SQL-like syntax, it has some important differences
 from traditional databases in functionality, API limitations (size and
-quantity of queries or uploads), and how Google charges for use of the service.
+quantity of queries or uploads), and how Google charges for use of the service.
 You should refer to Google documentation often, as the service seems to
-be changing and evolving. BigQuery is best for analyzing large sets of
+be changing and evolving. BigQuery is best for analyzing large sets of
 data quickly, but it is not a direct replacement for a transactional database.

 You can access the management console to determine project IDs by: