Compare commits


108 Commits

Author SHA1 Message Date
Mario de Frutos
0533018326 Merge pull request #298 from CartoDB/development
Release Python 0.9.4
2016-10-27 16:21:16 +02:00
Mario de Frutos
e380d51bec Merge pull request #296 from CartoDB/add_timeouts_requests
Release python 0.9.4
2016-10-27 16:19:12 +02:00
Mario de Frutos
6058960ec5 Added timeout for all the third-party connections 2016-10-27 16:17:37 +02:00
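The add_timeouts_requests branch name suggests these third-party calls go through python-requests; a minimal sketch of the pattern, assuming requests is the HTTP client (the URL handling, timeout value and helper name are illustrative, not the package's real API):

import requests

DEFAULT_TIMEOUT = 5  # seconds; an assumed value, not the one used by cartodb_services

def fetch_json(url, params=None, timeout=DEFAULT_TIMEOUT):
    # Bound every outbound call so a hung geocoding/routing/isolines provider
    # cannot block the request forever; requests raises requests.exceptions.Timeout.
    response = requests.get(url, params=params, timeout=timeout)
    response.raise_for_status()
    return response.json()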
Mario de Frutos
336d8be977 Hotfix: Typo in status code check 2016-10-26 17:49:07 +02:00
Mario de Frutos
75557837b0 Merge pull request #295 from CartoDB/development
Release python library 0.9.3
2016-10-26 16:58:39 +02:00
Mario de Frutos
e7c35457e1 Merge pull request #294 from CartoDB/293_matrix_without_json
Fix empty response from matrix
2016-10-26 16:41:17 +02:00
Mario de Frutos
80963e2589 504 errors return empty data instead of raising an exception
Due to some problems in Mapzen, we're receiving 504 errors from their
servers. To mitigate this problem, instead of raising an exception, we're
going to return empty data for that point
2016-10-26 16:39:35 +02:00
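A sketch of the mitigation described in this commit; only the behaviour (treat an upstream 504 as empty data for that point) comes from the message, while the function and parameter names are assumptions:

import requests

def matrix_for_point(url, params):
    response = requests.get(url, params=params, timeout=5)
    if response.status_code == 504:
        # Mapzen gateway timeout: return empty data for this point instead of
        # raising, so one slow upstream answer does not abort the whole query.
        return []
    response.raise_for_status()
    return response.json()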
Mario de Frutos
19d6cacdb3 Fix empty response from matrix
https://github.com/CartoDB/dataservices-api/issues/293
2016-10-26 16:00:50 +02:00
Rafa de la Torre
0d22942a72 Update version of python lib server package to 0.9.2 2016-10-21 17:38:32 +02:00
Rafa de la Torre
e8122c6728 Merge pull request #291 from CartoDB/281-quota-mapzen-routing
281 quota mapzen routing
2016-10-21 17:32:39 +02:00
Rafa de la Torre
d4ac2eb5e6 Tests for ServicesRedisConfig (mapzen quota) #281 2016-10-21 15:40:22 +02:00
Rafa de la Torre
db80d389e0 Do not require org geocoder and isolines quota to be present 2016-10-21 15:38:58 +02:00
Rafa de la Torre
8e02c64aeb Tests for UserMetricsService, mapzen routing #281 2016-10-21 15:21:03 +02:00
Rafa de la Torre
cc2ab1bc0c Fix test: make it independent of current date #281 2016-10-21 15:08:17 +02:00
Mario de Frutos
948463f836 Merge pull request #292 from CartoDB/docs-fixed-broken-hyperlink
fixed broken hyperlink
2016-10-20 19:17:16 +02:00
Rafa de la Torre
e1a7d1751c Fix corner case: used_quota == monthly_quota #281 2016-10-20 15:36:48 +02:00
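The corner case in this commit is the usual off-by-one in a quota comparison: when used_quota equals monthly_quota the quota is already spent. A sketch with assumed names (not the real QuotaChecker interface):

def quota_left(used_quota, monthly_quota, soft_limit=False):
    # With a soft limit the request is allowed through even when over quota;
    # otherwise equality must count as exhausted, hence "<" rather than "<=".
    if soft_limit:
        return True
    return used_quota < monthly_quota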
Rafa de la Torre
0c49107f96 Some tests for the QuotaChecker #281 2016-10-20 15:31:21 +02:00
csobier
05dc69af34 fixed broken hyperlink
Fixed one broken hyperlink that broke when the rebranding baseurl was applied.
2016-10-19 15:03:57 -04:00
Rafa de la Torre
247034c21e Some tests for RoutingConfig #281 2016-10-19 18:40:17 +02:00
Rafa de la Torre
2b1b1c981f Add routing to UserMetricsService.used_quota 2016-10-19 17:41:43 +02:00
Rafa de la Torre
aaff5564ec Use soft_limit for __check_routing_quota #281 2016-10-18 17:44:56 +02:00
Rafa de la Torre
72998c324a Simplify the code for regularity's sake #281 2016-10-18 17:40:50 +02:00
Rafa de la Torre
bbd9b6b98e Add soft limit to config object #281 2016-10-18 17:35:36 +02:00
Rafa de la Torre
27be704bd6 Changes needed to read quota from redis #281
Use the per-user/org quota from redis, if present. Otherwise default to the
quota stored in the server database (as it has done to date).
2016-10-18 17:26:45 +02:00
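A sketch of that fallback using redis-py; the redis key layout and function name are assumptions, and only the "redis first, server database otherwise" behaviour comes from the commit body:

import redis

def routing_quota_for(username, db_default_quota, redis_conn):
    # Prefer the per-user quota stored in redis; fall back to the value kept in
    # the server database when the key/field is not present (invented key names).
    raw = redis_conn.hget('rails:users:{0}'.format(username), 'mapzen_routing_quota')
    if raw is not None:
        return int(raw)
    return db_default_quota

# Assumed usage:
# conn = redis.StrictRedis(host='localhost', port=6379, db=5)
# quota = routing_quota_for('alice', db_default_quota=10000, redis_conn=conn)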
csobier
03f4a1f4f7 Merge pull request #290 from CartoDB/doc_update
Docs issue #1115
2016-10-14 10:48:08 -04:00
csobier
91131488c5 Docs issue #1115
Updated description for IP address
2016-10-14 16:45:23 +02:00
csobier
7d137f3efc Merge pull request #289 from CartoDB/docs-1115-ip-address
Docs issue #1115
2016-10-13 08:54:13 -04:00
csobier
93a5de5f20 Docs issue #1115
Updated description for IP address
2016-10-13 08:38:03 -04:00
Rafa de la Torre
fc35aac639 Merge pull request #288 from CartoDB/development
[server] Refactor of configurations for mapzen geocoder entry point
2016-10-06 15:18:04 +02:00
Rafa de la Torre
98d533b707 Bump version of python pip package to 0.9.1 2016-10-06 15:05:00 +02:00
Rafa de la Torre
6d0ad85d48 Fix for crash when sentinel_master_id is absent in config 2016-10-06 15:04:25 +02:00
Rafa de la Torre
00e6cace76 Update version of the python pip package 2016-10-06 11:06:28 +02:00
Rafa de la Torre
e9ad35ba1d Merge pull request #284 from CartoDB/redis-refactor-take2
Redis refactor: Take 2
2016-10-05 18:32:08 +02:00
Rafa de la Torre
dcb3935021 Upgrade/downgrade files for server extension 2016-10-05 18:28:24 +02:00
Rafa de la Torre
cded6c2f08 Test for NullConfigStorage 2016-10-04 15:20:41 +02:00
Rafa de la Torre
e1b357137a Rename s/storage/backend for consistency 2016-10-04 14:59:57 +02:00
Rafa de la Torre
3844cfc226 Tests for environment module 2016-10-04 14:56:06 +02:00
Rafa de la Torre
2a1276f4f1 Add test for redis_config module 2016-10-04 12:53:52 +02:00
Rafa de la Torre
35da7e48fd Implement a ServerConfigBackendFactory
mostly to keep layers separated.
2016-10-03 17:42:19 +02:00
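A sketch of what "keep layers separated" can look like here: upper layers ask a factory for a config backend and never construct the redis-backed or in-memory backend themselves. Every name other than ServerConfigBackendFactory and ConfigBackendInterface (which appear in these commits) is an assumption:

class InMemoryConfigBackend(object):
    # Simple dict-backed backend, handy for tests.
    def __init__(self, values=None):
        self._values = values or {}

    def get(self, key):
        return self._values.get(key)


class RedisConfigBackend(object):
    # Backend that reads configuration values from a redis connection.
    def __init__(self, redis_connection):
        self._redis = redis_connection

    def get(self, key):
        return self._redis.get(key)


class ServerConfigBackendFactory(object):
    # The environment decides which backend the rest of the server code gets.
    def __init__(self, environment, redis_connection=None):
        self._environment = environment
        self._redis_connection = redis_connection

    def get(self):
        if self._environment == 'testing':
            return InMemoryConfigBackend()
        return RedisConfigBackend(self._redis_connection)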
Rafa de la Torre
12aebb7eee Rename ConfigStorageInterface to ConfigBackendInterface 2016-10-03 17:35:33 +02:00
Rafa de la Torre
0d87a95270 Inject the environment into the LoggerConfigBuilder 2016-10-03 17:06:57 +02:00
Rafa de la Torre
18e1a5c7c9 Remove the class User and its tests
as it is not used.
2016-10-03 16:54:40 +02:00
Rafa de la Torre
fcca5da302 Add a RedisMetricsConnectionFactory class 2016-10-03 16:47:45 +02:00
Rafa de la Torre
1aec541906 Add ServerEnvironment and ServerEnvironmentBuilder 2016-10-03 16:21:28 +02:00
Rafa de la Torre
9e98e0794d Add backends for user and org configs 2016-10-03 16:05:08 +02:00
Rafa de la Torre
8fbb41742c Rename redis conf factories to builders 2016-10-03 15:25:01 +02:00
Rafa de la Torre
275a6dc27f Move redis_config related classes to separate file 2016-10-03 13:33:46 +02:00
Rafa de la Torre
d522083d5c Rename redis_connection_config test file as well 2016-10-03 13:22:47 +02:00
Rafa de la Torre
073163eb1a Rename redis_config.py to redis_connection_config.py 2016-10-03 13:20:30 +02:00
Rafa de la Torre
0c62c4bada Move the NullConfigStorage to a separate file 2016-10-03 13:17:04 +02:00
Rafa de la Torre
3361960cfc Move InMemoryConfigStorage to a separate file 2016-10-03 13:12:49 +02:00
Rafa de la Torre
86ab3abc53 Rename mapzen_geocoder module to mapzen_geocoder_config 2016-10-03 12:58:15 +02:00
Rafa de la Torre
b1f3405cd0 Rename variable s/user_geocoder_config/mapzen_geocoder_config 2016-10-03 12:52:18 +02:00
Rafa de la Torre
fb812ee15e Add tests for redis connection configs 2016-10-03 12:33:38 +02:00
Rafa de la Torre
c1dd410201 Make RedisConnectionConfigBuilder abstract
and also use custom ConfigException instead of generic one.
2016-10-03 12:21:45 +02:00
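A sketch of the two changes named in this commit (an abstract builder plus a domain-specific ConfigException instead of a generic Exception); the subclass name, method names and validation rule are illustrative:

from abc import ABCMeta, abstractmethod

class ConfigException(Exception):
    pass

class RedisConnectionConfigBuilder(object):
    # Abstract base class: concrete builders must implement get().
    # (The extension targets Python 2 / plpythonu, hence __metaclass__.)
    __metaclass__ = ABCMeta

    @abstractmethod
    def get(self):
        raise NotImplementedError

class RedisMetadataConnectionConfigBuilder(RedisConnectionConfigBuilder):
    def __init__(self, raw_config):
        self._raw_config = raw_config

    def get(self):
        if 'redis_host' not in self._raw_config:
            raise ConfigException('redis_host is mandatory')
        return self._raw_config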
Rafa de la Torre
34ddd28e6b Move ConfigException to a separate file 2016-10-03 12:20:17 +02:00
Rafa de la Torre
d85bc65bf8 Ignore coverage files 2016-10-03 12:18:47 +02:00
Carla
443fe88d5a Add newline to end of file 2016-09-30 13:08:27 +02:00
Carla Iriberri
6c61626214 Remove RedisMock TODO 2016-09-30 12:29:00 +02:00
Carla Iriberri
74d2fba763 Add Redis Mock -- tested and working 2016-09-30 12:28:27 +02:00
Rafa de la Torre
e24819f193 Take the environment into account 2016-09-30 11:29:44 +02:00
Rafa de la Torre
1e6ee8d5c1 Add an Environment class 2016-09-30 11:21:35 +02:00
Rafa de la Torre
3a6cc4c364 Add mapzen config and integrate into legacy code 2016-09-29 18:41:16 +02:00
Rafa de la Torre
8ad2434b1d Add instantiation of configs to mapzen geocoder 2016-09-29 17:47:36 +02:00
Rafa de la Torre
0b7b44d8a5 Add a couple of factories to abstract user/org configs 2016-09-29 17:46:17 +02:00
Rafa de la Torre
02a2619b45 Fix instantiation of redis metadata connection 2016-09-29 17:19:18 +02:00
Rafa de la Torre
4b4a02905c Implement the RedisConfigStorage 2016-09-29 17:18:45 +02:00
Rafa de la Torre
1f3a655ae5 Add all the code to instantiate a user_config_storage (WIP) 2016-09-29 16:57:31 +02:00
Rafa de la Torre
9d60fde0b8 A stub for RedisConfigStorage (WIP) 2016-09-29 13:07:40 +02:00
Rafa de la Torre
efdc151282 Add things to get logger config from storage 2016-09-28 18:22:51 +02:00
Rafa de la Torre
fd2cc21942 Add server config storage classes 2016-09-28 17:32:26 +02:00
Rafa de la Torre
18f05fbd4f User class with tests 2016-09-28 16:30:37 +02:00
Carla
d2f4586bae Merge pull request #283 from CartoDB/master
Getting master back to develop: Update NEWS.md for new Python release
2016-09-28 15:53:59 +02:00
Carla
54eb279ae8 Update NEWS.md 2016-09-28 15:40:25 +02:00
Carla
85d6c2a54e Update NEWS.md 2016-09-28 15:38:48 +02:00
Carla
cad2051efe Merge pull request #282 from CartoDB/development
Release Python server package version 0.8.1
2016-09-28 15:37:58 +02:00
Carla Iriberri
96a93e3c56 Python package version 0.8.1 2016-09-28 15:09:29 +02:00
Rafa de la Torre
facda9e8be Merge pull request #275 from CartoDB/fix_qps_retry
Fixed QPS retry decorator
2016-09-28 13:47:45 +02:00
Rafa de la Torre
64fc18b9e0 Fix and improve test speed 2016-09-28 13:43:50 +02:00
Mario de Frutos
9381d5644b Fixed QPS retry decorator 2016-09-16 13:46:02 +02:00
Carla
9f55f2ee3b Update README.md 2016-09-09 11:44:57 +02:00
Carla
1087c1266b Merge pull request #272 from CartoDB/development
Release client 0.11.1
2016-09-08 12:42:26 +02:00
Carla
d5a296a30c Update NEWS.md 2016-09-08 10:39:48 +02:00
Carla Iriberri
f8caf4314d Release client 0.11.1 2016-09-07 17:52:53 +02:00
Carla
d7910fbbf1 Merge pull request #271 from CartoDB/new_framework_obs_compat
Use real function name for compatibility with observatory
2016-09-07 17:45:32 +02:00
Carla
d47049c813 Escape characters in example 2016-09-07 16:01:45 +02:00
Carla
cc8f93c535 Use real function name for compatibility 2016-09-07 15:40:20 +02:00
Carla
3f9441de7e Merge pull request #269 from CartoDB/development
New release client 0.11.0 and server  0.15.2
2016-09-02 12:20:24 +02:00
Carla
fe41359a1f Merge pull request #268 from CartoDB/release_obs_measure
Release DST OBS_Measure
2016-09-02 11:31:06 +02:00
Carla Iriberri
46a934b178 Client 0.11.0 2016-09-01 18:07:44 +02:00
Carla Iriberri
184358bdec Server 0.15.1 2016-09-01 18:06:56 +02:00
Carla Iriberri
a6d546f2ee Bump version client 0.11.0 2016-09-01 17:42:57 +02:00
Carla
fc99f7aba9 Merge pull request #267 from CartoDB/augment_revamp_analysis
Table-level OBS_GetMeasure revamp
2016-09-01 17:21:56 +02:00
Carla Iriberri
e959873b32 Check server return 2016-09-01 16:37:54 +02:00
Mario de Frutos
a98093540d Fix table level tests for server 2016-09-01 15:53:19 +02:00
Carla Iriberri
78add220cd New tests for restructuration - client 2016-09-01 15:46:23 +02:00
Carla Iriberri
cf2f86136b Update signatures in functions 2016-09-01 15:37:08 +02:00
Carla Iriberri
fb183b07ee Update grants for new signatures 2016-09-01 15:34:57 +02:00
Carla Iriberri
5ab727bcb6 Remove warning and fix server_conn_str path 2016-09-01 14:36:12 +02:00
Carla Iriberri
1e9b551160 Add populate table function and several fixes 2016-08-30 16:38:44 +02:00
Mario de Frutos
699dc9bf0e Merge pull request #266 from CartoDB/development
Release 0.15.0
2016-08-30 12:16:20 +02:00
Carla Iriberri
fc291a7c63 First steps 2016-08-29 18:07:58 +02:00
Mario de Frutos
d73af32c2c Version 0.15.0 release artifacts 2016-08-29 17:12:00 +02:00
Mario de Frutos
7ea88fa051 Merge pull request #265 from CartoDB/221_mapzen_for_city_names
Mapzen to geocode city names
2016-08-29 14:41:48 +02:00
Mario de Frutos
d2ca40cf38 Removed unnecessary observatory dependency in tests 2016-08-29 14:38:38 +02:00
Mario de Frutos
18ae2525b6 Add search type filter for Mapzen geocoder 2016-08-29 14:32:49 +02:00
Mario de Frutos
06462fdf7a Control file for version 0.15 and empty upgrade/downgrade 2016-08-29 14:04:08 +02:00
Mario de Frutos
71d5ce951a Use mapzen as first option for the namedplace geocoding 2016-08-26 16:39:25 +02:00
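This commit (together with the 0.15.0 NEWS entry further down) describes a Mapzen-first strategy with an internal-geocoder fallback for named places; a sketch of that control flow, with both geocoder callables assumed:

def geocode_namedplace(city_name, mapzen_geocoder, internal_geocoder, logger=None):
    # Try the Mapzen search service first; on any error fall back to the
    # internal (database) geocoder, as described for release 0.15.0.
    try:
        return mapzen_geocoder(city_name)
    except Exception as exc:
        if logger:
            logger.warning('Mapzen geocoding failed, falling back: {0}'.format(exc))
        return internal_geocoder(city_name)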
96 changed files with 11437 additions and 408 deletions

.gitignore

@@ -1,5 +1,6 @@
.DS_Store
*.pyc
.coverage
cartodb_services.egg-info/
build/
dist/

NEWS.md

@@ -1,3 +1,33 @@
October 21st, 2016
==================
* Version 0.9.2 of the python package
* Mapzen routing quota is now configurable per user
September 28, 2016
==========
* Released version 0.8.1 of Python package cartodb\_services
* Improvements in QPS retry decorator for requests to external services
https://github.com/CartoDB/dataservices-api/releases/tag/python-0.8.1
September 8, 2016
===========
* Released version 0.11.1 of the client
* Minor change in the name of the function parameter sent to the server and Observatory backend, for compatibility with the latest observatory-extension framework updates
September 1, 2016
===========
* Released version 0.11.0 of the client
* Include DS table functions to create and populate a table using the Observatory GetMeasure function
* Released version 0.15.1 of the server
* Rename DS table functions
August 29, 2016
===========
* Released version 0.15.0 of the server
* Geocode namedplace point functions now use the Mapzen search service and, in case of error,
fall back to the internal geocoder
August 19, 2016
===========
* Released version 0.7.4.2 of the server python library


@@ -5,7 +5,7 @@ The CARTO Data Services SQL API
Steps to deploy a new Data Services API version :
- Deploy new version of dataservices API to all servers
- Update the server user using: ALTER EXTENSION cdb_dataservices_server UPDATE TO '<CURRENT_VERSION>';
- Update the server user using: ALTER EXTENSION cdb_dataservices_server UPDATE TO '\<CURRENT_VERSION\>';
- Update the python dependencies if needed: **cartodb_geocoder** and **heremaps**
- Add the needed config in the `cdb_conf` table:
- `redis_metadata_config` and `redis_metrics_conf`


@@ -13,8 +13,8 @@ OLD_VERSIONS = $(wildcard old_versions/*.sql)
# @see http://www.postgresql.org/docs/current/static/extend-pgxs.html
DATA = $(NEW_EXTENSION_ARTIFACT) \
$(OLD_VERSIONS) \
cdb_dataservices_client--0.10.1--0.10.2.sql \
cdb_dataservices_client--0.10.2--0.10.1.sql
cdb_dataservices_client--0.11.0--0.11.1.sql \
cdb_dataservices_client--0.11.1--0.11.0.sql
REGRESS = $(notdir $(basename $(wildcard test/sql/*test.sql)))


@@ -0,0 +1,140 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_client UPDATE TO '0.11.1'" to load this file. \quit
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'OBS_GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)
)
)
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Create a new table with the required columns
plpy.execute('CREATE TABLE "{schema}".{table_name} ( '
'cartodb_id int, the_geom geometry, {columns_with_types} '
');'
.format(schema=user_schema, table_name=output_table_name, columns_with_types=columns_with_types)
)
plpy.execute('ALTER TABLE "{schema}".{table_name} OWNER TO "{user}";'
.format(schema=user_schema, table_name=output_table_name, user=user_db_role)
)
return True
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'OBS_GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute(
"SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)))
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [
colnames_arr[i] +
' ' +
coltypes_arr[i] for i in range(
0,
len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
aliased_colname_list = ','.join(
['result.' + name for name in colnames_arr])
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute(
"SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, "
"{schema}::text, {dbname}::text, {table_name}::text);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
user_db_role=plpy.quote_literal(user_db_role),
schema=plpy.quote_literal(user_schema),
dbname=plpy.quote_literal(dbname),
table_name=plpy.quote_literal(table_name)))
if ds_fdw_metadata[0]["schemaname"]:
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
else:
raise Exception('Error connecting dataset via FDW')
# Create a new table with the required columns
plpy.execute(
'INSERT INTO "{schema}".{analysis_table_name} '
'SELECT ut.cartodb_id, ut.the_geom, {colname_list} '
'FROM "{schema}".{table_name} ut '
'LEFT JOIN _DST_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, '
'{function_name}::text, {params}::json) '
'AS result ({columns_with_types}, cartodb_id int) '
'ON result.cartodb_id = ut.cartodb_id;' .format(
schema=user_schema,
analysis_table_name=output_table_name,
colname_list=aliased_colname_list,
table_name=table_name,
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params),
columns_with_types=columns_with_types))
# Wipe user FDW data from the server
wiped = plpy.execute(
"SELECT cdb_dataservices_client._DST_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, "
"{server_table_name}::text, {fdw_server}::text)" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
fdw_server=plpy.quote_literal(server_name)))
return True
$$ LANGUAGE plpythonu;


@@ -0,0 +1,140 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_client UPDATE TO '0.11.0'" to load this file. \quit
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)
)
)
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Create a new table with the required columns
plpy.execute('CREATE TABLE "{schema}".{table_name} ( '
'cartodb_id int, the_geom geometry, {columns_with_types} '
');'
.format(schema=user_schema, table_name=output_table_name, columns_with_types=columns_with_types)
)
plpy.execute('ALTER TABLE "{schema}".{table_name} OWNER TO "{user}";'
.format(schema=user_schema, table_name=output_table_name, user=user_db_role)
)
return True
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute(
"SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)))
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [
colnames_arr[i] +
' ' +
coltypes_arr[i] for i in range(
0,
len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
aliased_colname_list = ','.join(
['result.' + name for name in colnames_arr])
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute(
"SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, "
"{schema}::text, {dbname}::text, {table_name}::text);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
user_db_role=plpy.quote_literal(user_db_role),
schema=plpy.quote_literal(user_schema),
dbname=plpy.quote_literal(dbname),
table_name=plpy.quote_literal(table_name)))
if ds_fdw_metadata[0]["schemaname"]:
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
else:
raise Exception('Error connecting dataset via FDW')
# Create a new table with the required columns
plpy.execute(
'INSERT INTO "{schema}".{analysis_table_name} '
'SELECT ut.cartodb_id, ut.the_geom, {colname_list} '
'FROM "{schema}".{table_name} ut '
'LEFT JOIN _DST_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, '
'{function_name}::text, {params}::json) '
'AS result ({columns_with_types}, cartodb_id int) '
'ON result.cartodb_id = ut.cartodb_id;' .format(
schema=user_schema,
analysis_table_name=output_table_name,
colname_list=aliased_colname_list,
table_name=table_name,
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params),
columns_with_types=columns_with_types))
# Wipe user FDW data from the server
wiped = plpy.execute(
"SELECT cdb_dataservices_client._DST_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, "
"{server_table_name}::text, {fdw_server}::text)" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
fdw_server=plpy.quote_literal(server_name)))
return True
$$ LANGUAGE plpythonu;

File diff suppressed because it is too large.


@@ -1,5 +1,5 @@
comment = 'CartoDB dataservices client API extension'
default_version = '0.10.2'
default_version = '0.11.1'
requires = 'plproxy, cartodb'
superuser = true
schema = cdb_dataservices_client


@@ -0,0 +1,289 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_client UPDATE TO '0.11.0'" to load this file. \quit
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_GetTable(text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_AugmentTable(text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client.__OBS_AugmentTable(text, text, text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client.__OBS_GetTable(text, text, text, text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_ConnectUserTable(text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_GetReturnMetadata(text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_FetchJoinFdwTableData(text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._OBS_DisconnectUserTable(text, text, text, text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure(
output_table_name text,
params json
) RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username,
orgname,
user_db_role,
user_schema,
output_table_name,
params
) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure(
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
dbname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT current_database() INTO dbname;
SELECT cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username,
orgname,
user_db_role,
user_schema,
dbname,
table_name,
output_table_name,
params
) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)
)
)
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Create a new table with the required columns
plpy.execute('CREATE TABLE "{schema}".{table_name} ( '
'cartodb_id int, the_geom geometry, {columns_with_types} '
');'
.format(schema=user_schema, table_name=output_table_name, columns_with_types=columns_with_types)
)
plpy.execute('ALTER TABLE "{schema}".{table_name} OWNER TO "{user}";'
.format(schema=user_schema, table_name=output_table_name, user=user_db_role)
)
return True
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute(
"SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)))
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [
colnames_arr[i] +
' ' +
coltypes_arr[i] for i in range(
0,
len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
aliased_colname_list = ','.join(
['result.' + name for name in colnames_arr])
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute(
"SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, "
"{schema}::text, {dbname}::text, {table_name}::text);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
user_db_role=plpy.quote_literal(user_db_role),
schema=plpy.quote_literal(user_schema),
dbname=plpy.quote_literal(dbname),
table_name=plpy.quote_literal(table_name)))
if ds_fdw_metadata[0]["schemaname"]:
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
else:
raise Exception('Error connecting dataset via FDW')
# Create a new table with the required columns
plpy.execute(
'INSERT INTO "{schema}".{analysis_table_name} '
'SELECT ut.cartodb_id, ut.the_geom, {colname_list} '
'FROM "{schema}".{table_name} ut '
'LEFT JOIN _DST_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, '
'{function_name}::text, {params}::json) '
'AS result ({columns_with_types}, cartodb_id int) '
'ON result.cartodb_id = ut.cartodb_id;' .format(
schema=user_schema,
analysis_table_name=output_table_name,
colname_list=aliased_colname_list,
table_name=table_name,
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params),
columns_with_types=columns_with_types))
# Wipe user FDW data from the server
wiped = plpy.execute(
"SELECT cdb_dataservices_client._DST_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, "
"{server_table_name}::text, {fdw_server}::text)" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
fdw_server=plpy.quote_literal(server_name)))
return True
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_ConnectUserTable(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text
)RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_GetReturnMetadata(
username text,
orgname text,
function_name text,
params json
) RETURNS cdb_dataservices_client.ds_return_metadata AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_FetchJoinFdwTableData(
username text,
orgname text,
table_schema text,
table_name text,
function_name text,
params json
) RETURNS SETOF record AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_DisconnectUserTable(
username text,
orgname text,
table_schema text,
table_name text,
server_name text
) RETURNS boolean AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_DisconnectUserTable;
$$ LANGUAGE plproxy;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure(output_table_name text, params json) TO publicuser;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure(table_name text, output_table_name text, params json) TO publicuser;


@@ -0,0 +1,281 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_client UPDATE TO '0.10.2'" to load this file. \quit
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure(text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure(text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(text, text, text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_ConnectUserTable(text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_GetReturnMetadata(text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_FetchJoinFdwTableData(text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_client._DST_DisconnectUserTable(text, text, text, text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_GetTable(table_name text, output_table_name text, function_name text, params json)
RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
dbname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT current_database() INTO dbname;
SELECT cdb_dataservices_client.__OBS_GetTable(username, orgname, user_db_role, user_schema, dbname, table_name, output_table_name, function_name, params) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_AugmentTable(table_name text, function_name text, params json)
RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
dbname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT current_database() INTO dbname;
SELECT cdb_dataservices_client.__OBS_AugmentTable(username, orgname, user_db_role, user_schema, dbname, table_name, function_name, params) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__OBS_AugmentTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text, function_name text, params json)
RETURNS boolean AS $$
from time import strftime
try:
server_table_name = None
temporary_table_name = 'ds_tmp_' + str(strftime("%s")) + table_name
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._OBS_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params))
)
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
# Prepare column and type strings required in the SQL queries
colnames = ','.join(colnames_arr)
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute("SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {user_schema}::text, {dbname}::text, {table_name}::text);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), user_schema=plpy.quote_literal(user_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name))
)
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
# Create temporary table with the augmented results
plpy.execute('CREATE UNLOGGED TABLE "{user_schema}".{temp_table_name} AS '
'(SELECT {columns}, cartodb_id '
'FROM cdb_dataservices_client._OBS_FetchJoinFdwTableData('
'{username}::text, {orgname}::text, {schema}::text, {table_name}::text, {function_name}::text, {params}::json) '
'AS results({columns_with_types}, cartodb_id int) )'
.format(columns=colnames, username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname),
user_schema=user_schema, schema=plpy.quote_literal(server_schema), table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params), columns_with_types=columns_with_types,
temp_table_name=temporary_table_name)
)
# Wipe user FDW data from the server
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
# Add index to cartodb_id
plpy.execute('CREATE UNIQUE INDEX {temp_table_name}_pkey ON "{user_schema}".{temp_table_name} (cartodb_id)'
.format(user_schema=user_schema, temp_table_name=temporary_table_name)
)
# Prepare table to receive augmented results in new columns
for idx, column in enumerate(colnames_arr):
if colnames_arr[idx] != 'the_geom':
plpy.execute('ALTER TABLE "{user_schema}".{table_name} ADD COLUMN {column_name} {column_type}'
.format(user_schema=user_schema, table_name=table_name, column_name=colnames_arr[idx], column_type=coltypes_arr[idx])
)
# Populate the user table with the augmented results
plpy.execute('UPDATE "{user_schema}".{table_name} SET {columns} = '
'(SELECT {columns} FROM "{user_schema}".{temporary_table_name} '
'WHERE "{user_schema}".{temporary_table_name}.cartodb_id = "{user_schema}".{table_name}.cartodb_id)'
.format(columns = colnames, username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname),
user_schema = user_schema, table_name=table_name, function_name=function_name, params=params, columns_with_types=columns_with_types,
temporary_table_name=temporary_table_name)
)
plpy.execute('DROP TABLE IF EXISTS "{user_schema}".{temporary_table_name}'
.format(user_schema=user_schema, table_name=table_name, temporary_table_name=temporary_table_name)
)
return True
except Exception as e:
plpy.warning('Error trying to augment table {0}'.format(e))
# Wipe user FDW data from the server in case of failure if the table was connected
if server_table_name:
# Wipe local temporary table
plpy.execute('DROP TABLE IF EXISTS "{user_schema}".{temporary_table_name}'
.format(user_schema=user_schema, table_name=table_name, temporary_table_name=temporary_table_name)
)
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return False
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__OBS_GetTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text, output_table_name text, function_name text, params json)
RETURNS boolean AS $$
try:
server_table_name = None
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._OBS_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params))
)
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
# Prepare column and type strings required in the SQL queries
colnames = ','.join(colnames_arr)
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute("SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {table_name}::text);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), schema=plpy.quote_literal(user_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name))
)
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
# Get list of user columns to include in the new table
user_table_columns = ','.join(
plpy.execute('SELECT array_agg(\'user_table.\' || attname) AS columns '
'FROM pg_attribute WHERE attrelid = \'"{user_schema}".{table_name}\'::regclass '
'AND attnum > 0 AND NOT attisdropped AND attname NOT LIKE \'the_geom_webmercator\' '
'AND NOT attname LIKE ANY(string_to_array(\'{colnames}\',\',\'));'
.format(user_schema=user_schema, table_name=table_name, colnames=colnames)
)[0]["columns"]
)
# Populate a new table with the augmented results
plpy.execute('CREATE TABLE "{user_schema}".{output_table_name} AS '
'(SELECT results.{columns}, {user_table_columns} '
'FROM {table_name} AS user_table '
'LEFT JOIN cdb_dataservices_client._OBS_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {function_name}::text, {params}::json) as results({columns_with_types}, cartodb_id int) '
'ON results.cartodb_id = user_table.cartodb_id)'
.format(output_table_name=output_table_name, columns=colnames, user_table_columns=user_table_columns, username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname), user_schema=user_schema, server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name),
table_name=table_name, function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params), columns_with_types=columns_with_types)
)
plpy.execute('ALTER TABLE "{schema}".{table_name} OWNER TO "{user}";'
.format(schema=user_schema, table_name=output_table_name, user=user_db_role)
)
# Wipe user FDW data from the server
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return True
except Exception as e:
plpy.warning('Error trying to get table {0}'.format(e))
# Wipe user FDW data from the server in case of failure if the table was connected
if server_table_name:
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return False
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_ConnectUserTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_client.ds_return_metadata AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS SETOF record AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, server_name text)
RETURNS boolean AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_DisconnectUserTable;
$$ LANGUAGE plproxy;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._obs_augmenttable(table_name text, function_name text, params json) TO publicuser;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._obs_gettable(table_name text, output_table_name text, function_name text, params json) TO publicuser;

File diff suppressed because it is too large.


@@ -33,7 +33,6 @@
- { name: admin1_name, type: text}
- { name: country_name, type: text}
- name: cdb_geocode_postalcode_polygon
return_type: Geometry
params:


@@ -1,8 +1,53 @@
CREATE TYPE cdb_dataservices_client.ds_fdw_metadata as (schemaname text, tabname text, servername text);
CREATE TYPE cdb_dataservices_client.ds_return_metadata as (colnames text[], coltypes text[]);
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_GetTable(table_name text, output_table_name text, function_name text, params json)
RETURNS boolean AS $$
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure(
output_table_name text,
params json
) RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username,
orgname,
user_db_role,
user_schema,
output_table_name,
params
) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure(
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
@@ -31,238 +76,200 @@ BEGIN
SELECT current_database() INTO dbname;
SELECT cdb_dataservices_client.__OBS_GetTable(username, orgname, user_db_role, user_schema, dbname, table_name, output_table_name, function_name, params) INTO result;
SELECT cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username,
orgname,
user_db_role,
user_schema,
dbname,
table_name,
output_table_name,
params
) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_AugmentTable(table_name text, function_name text, params json)
RETURNS boolean AS $$
DECLARE
username text;
user_db_role text;
orgname text;
dbname text;
user_schema text;
result boolean;
BEGIN
IF session_user = 'publicuser' OR session_user ~ 'cartodb_publicuser_*' THEN
RAISE EXCEPTION 'The api_key must be provided';
END IF;
SELECT session_user INTO user_db_role;
SELECT u, o INTO username, orgname FROM cdb_dataservices_client._cdb_entity_config() AS (u text, o text);
-- JSON value stored "" is taken as literal
IF username IS NULL OR username = '' OR username = '""' THEN
RAISE EXCEPTION 'Username is a mandatory argument';
END IF;
IF orgname IS NULL OR orgname = '' OR orgname = '""' THEN
user_schema := 'public';
ELSE
user_schema := username;
END IF;
SELECT current_database() INTO dbname;
SELECT cdb_dataservices_client.__OBS_AugmentTable(username, orgname, user_db_role, user_schema, dbname, table_name, function_name, params) INTO result;
RETURN result;
END;
$$ LANGUAGE 'plpgsql' SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__OBS_AugmentTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text, function_name text, params json)
RETURNS boolean AS $$
from time import strftime
try:
server_table_name = None
temporary_table_name = 'ds_tmp_' + str(strftime("%s")) + table_name
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._OBS_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params))
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PrepareTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'OBS_GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)
)
)
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
# Prepare column and type strings required in the SQL queries
colnames = ','.join(colnames_arr)
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute("SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {user_schema}::text, {dbname}::text, {table_name}::text);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), user_schema=plpy.quote_literal(user_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name))
)
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
# Create temporary table with the augmented results
plpy.execute('CREATE UNLOGGED TABLE "{user_schema}".{temp_table_name} AS '
'(SELECT {columns}, cartodb_id '
'FROM cdb_dataservices_client._OBS_FetchJoinFdwTableData('
'{username}::text, {orgname}::text, {schema}::text, {table_name}::text, {function_name}::text, {params}::json) '
'AS results({columns_with_types}, cartodb_id int) )'
.format(columns=colnames, username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname),
user_schema=user_schema, schema=plpy.quote_literal(server_schema), table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params), columns_with_types=columns_with_types,
temp_table_name=temporary_table_name)
)
# Wipe user FDW data from the server
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
# Add index to cartodb_id
plpy.execute('CREATE UNIQUE INDEX {temp_table_name}_pkey ON "{user_schema}".{temp_table_name} (cartodb_id)'
.format(user_schema=user_schema, temp_table_name=temporary_table_name)
)
# Prepare table to receive augmented results in new columns
for idx, column in enumerate(colnames_arr):
if column != 'the_geom':
plpy.execute('ALTER TABLE "{user_schema}".{table_name} ADD COLUMN {column_name} {column_type}'
.format(user_schema=user_schema, table_name=table_name, column_name=colnames_arr[idx], column_type=coltypes_arr[idx])
)
# Populate the user table with the augmented results
plpy.execute('UPDATE "{user_schema}".{table_name} SET {columns} = '
'(SELECT {columns} FROM "{user_schema}".{temporary_table_name} '
'WHERE "{user_schema}".{temporary_table_name}.cartodb_id = "{user_schema}".{table_name}.cartodb_id)'
.format(columns = colnames, username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname),
user_schema = user_schema, table_name=table_name, function_name=function_name, params=params, columns_with_types=columns_with_types,
temporary_table_name=temporary_table_name)
)
plpy.execute('DROP TABLE IF EXISTS "{user_schema}".{temporary_table_name}'
.format(user_schema=user_schema, table_name=table_name, temporary_table_name=temporary_table_name)
)
return True
except Exception as e:
plpy.warning('Error trying to augment table {0}'.format(e))
# Wipe user FDW data from the server in case of failure if the table was connected
if server_table_name:
# Wipe local temporary table
plpy.execute('DROP TABLE IF EXISTS "{user_schema}".{temporary_table_name}'
.format(user_schema=user_schema, table_name=table_name, temporary_table_name=temporary_table_name)
)
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return False
$$ LANGUAGE plpythonu;
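Per the new test expectations further down in this diff, the prepare step only creates the destination table with the requested measure columns (it stays empty until the populate step runs). A usage sketch through the public wrapper, mirroring the call in those test fixtures:

SELECT cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure('my_table_dst', '{"dummy":"dummy"}'::json);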
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__OBS_GetTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text, output_table_name text, function_name text, params json)
RETURNS boolean AS $$
try:
server_table_name = None
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute("SELECT colnames, coltypes "
"FROM cdb_dataservices_client._OBS_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params))
)
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
# Prepare column and type strings required in the SQL queries
colnames = ','.join(colnames_arr)
columns_with_types_arr = [colnames_arr[i] + ' ' + coltypes_arr[i] for i in range(0,len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute("SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {table_name}::text);"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), schema=plpy.quote_literal(user_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name))
)
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
# Get list of user columns to include in the new table
user_table_columns = ','.join(
plpy.execute('SELECT array_agg(\'user_table.\' || attname) AS columns '
'FROM pg_attribute WHERE attrelid = \'"{user_schema}".{table_name}\'::regclass '
'AND attnum > 0 AND NOT attisdropped AND attname NOT LIKE \'the_geom_webmercator\' '
'AND NOT attname LIKE ANY(string_to_array(\'{colnames}\',\',\'));'
.format(user_schema=user_schema, table_name=table_name, colnames=colnames)
)[0]["columns"]
# Create a new table with the required columns
plpy.execute('CREATE TABLE "{schema}".{table_name} ( '
'cartodb_id int, the_geom geometry, {columns_with_types} '
');'
.format(schema=user_schema, table_name=output_table_name, columns_with_types=columns_with_types)
)
# Populate a new table with the augmented results
plpy.execute('CREATE TABLE "{user_schema}".{output_table_name} AS '
'(SELECT results.{columns}, {user_table_columns} '
'FROM {table_name} AS user_table '
'LEFT JOIN cdb_dataservices_client._OBS_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {function_name}::text, {params}::json) as results({columns_with_types}, cartodb_id int) '
'ON results.cartodb_id = user_table.cartodb_id)'
.format(output_table_name=output_table_name, columns=colnames, user_table_columns=user_table_columns, username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname), user_schema=user_schema, server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name),
table_name=table_name, function_name=plpy.quote_literal(function_name), params=plpy.quote_literal(params), columns_with_types=columns_with_types)
)
plpy.execute('ALTER TABLE "{schema}".{table_name} OWNER TO "{user}";'
.format(schema=user_schema, table_name=output_table_name, user=user_db_role)
)
# Wipe user FDW data from the server
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return True
except Exception as e:
plpy.warning('Error trying to get table {0}'.format(e))
# Wipe user FDW data from the server in case of failure if the table was connected
if server_table_name:
wiped = plpy.execute("SELECT cdb_dataservices_client._OBS_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, {fdw_server}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), server_schema=plpy.quote_literal(server_schema), server_table_name=plpy.quote_literal(server_table_name), fdw_server=plpy.quote_literal(server_name))
)
return False
return True
$$ LANGUAGE plpythonu;
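For reference, the now-replaced acceptance tests further down exercised this code path through its public wrapper, which injects the user context automatically:

SELECT cdb_dataservices_client._OBS_GetTable('my_table', 'my_table_new', 'dummy', '{"dummy":"dummy"}'::json);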
CREATE OR REPLACE FUNCTION cdb_dataservices_client.__DST_PopulateTableOBS_GetMeasure(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text,
output_table_name text,
params json
) RETURNS boolean AS $$
function_name = 'OBS_GetMeasure'
# Obtain return types for augmentation procedure
ds_return_metadata = plpy.execute(
"SELECT colnames, coltypes "
"FROM cdb_dataservices_client._DST_GetReturnMetadata({username}::text, {orgname}::text, {function_name}::text, {params}::json);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params)))
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_ConnectUserTable(username text, orgname text, user_db_role text, user_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_ConnectUserTable;
if ds_return_metadata[0]["colnames"]:
colnames_arr = ds_return_metadata[0]["colnames"]
coltypes_arr = ds_return_metadata[0]["coltypes"]
else:
raise Exception('Error retrieving OBS_GetMeasure metadata')
# Prepare column and type strings required in the SQL queries
columns_with_types_arr = [
colnames_arr[i] +
' ' +
coltypes_arr[i] for i in range(
0,
len(colnames_arr))]
columns_with_types = ','.join(columns_with_types_arr)
aliased_colname_list = ','.join(
['result.' + name for name in colnames_arr])
# Instruct the OBS server side to establish a FDW
# The metadata is obtained as well in order to:
# - (a) be able to write the query to grab the actual data to be executed in the remote server via pl/proxy,
# - (b) be able to tell OBS to free resources when done.
ds_fdw_metadata = plpy.execute(
"SELECT schemaname, tabname, servername "
"FROM cdb_dataservices_client._DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, "
"{schema}::text, {dbname}::text, {table_name}::text);" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
user_db_role=plpy.quote_literal(user_db_role),
schema=plpy.quote_literal(user_schema),
dbname=plpy.quote_literal(dbname),
table_name=plpy.quote_literal(table_name)))
if ds_fdw_metadata[0]["schemaname"]:
server_schema = ds_fdw_metadata[0]["schemaname"]
server_table_name = ds_fdw_metadata[0]["tabname"]
server_name = ds_fdw_metadata[0]["servername"]
else:
raise Exception('Error connecting dataset via FDW')
# Create a new table with the required columns
plpy.execute(
'INSERT INTO "{schema}".{analysis_table_name} '
'SELECT ut.cartodb_id, ut.the_geom, {colname_list} '
'FROM "{schema}".{table_name} ut '
'LEFT JOIN _DST_FetchJoinFdwTableData({username}::text, {orgname}::text, {server_schema}::text, {server_table_name}::text, '
'{function_name}::text, {params}::json) '
'AS result ({columns_with_types}, cartodb_id int) '
'ON result.cartodb_id = ut.cartodb_id;' .format(
schema=user_schema,
analysis_table_name=output_table_name,
colname_list=aliased_colname_list,
table_name=table_name,
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
function_name=plpy.quote_literal(function_name),
params=plpy.quote_literal(params),
columns_with_types=columns_with_types))
# Wipe user FDW data from the server
wiped = plpy.execute(
"SELECT cdb_dataservices_client._DST_DisconnectUserTable({username}::text, {orgname}::text, {server_schema}::text, "
"{server_table_name}::text, {fdw_server}::text)" .format(
username=plpy.quote_nullable(username),
orgname=plpy.quote_nullable(orgname),
server_schema=plpy.quote_literal(server_schema),
server_table_name=plpy.quote_literal(server_table_name),
fdw_server=plpy.quote_literal(server_name)))
return True
$$ LANGUAGE plpythonu;
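The populate step joins the user's table against the FDW-fetched measure values and inserts the result into the table created by the prepare step. A usage sketch matching the call in the new test fixtures below:

SELECT cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure('user_table', 'my_table_dst', '{"dummy":"dummy"}'::json);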
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_ConnectUserTable(
username text,
orgname text,
user_db_role text,
user_schema text,
dbname text,
table_name text
)RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_client.ds_return_metadata AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_GetReturnMetadata;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_GetReturnMetadata(
username text,
orgname text,
function_name text,
params json
) RETURNS cdb_dataservices_client.ds_return_metadata AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS SETOF record AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_FetchJoinFdwTableData;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_FetchJoinFdwTableData(
username text,
orgname text,
table_schema text,
table_name text,
function_name text,
params json
) RETURNS SETOF record AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, server_name text)
RETURNS boolean AS $$
CONNECT _server_conn_str();
TARGET cdb_dataservices_server._OBS_DisconnectUserTable;
CREATE OR REPLACE FUNCTION cdb_dataservices_client._DST_DisconnectUserTable(
username text,
orgname text,
table_schema text,
table_name text,
server_name text
) RETURNS boolean AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._DST_DisconnectUserTable;
$$ LANGUAGE plproxy;
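All of the _DST_* client functions above share the same pl/proxy shape: CONNECT resolves the server connection string and TARGET names the remote function, which is invoked with the same argument list. A minimal sketch of that pattern, using a hypothetical function name:

CREATE OR REPLACE FUNCTION cdb_dataservices_client._example_remote_call(username text, orgname text)
RETURNS boolean AS $$
CONNECT cdb_dataservices_client._server_conn_str();
TARGET cdb_dataservices_server._example_remote_call;
$$ LANGUAGE plproxy;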


@@ -1,2 +1,2 @@
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._obs_augmenttable(table_name text, function_name text, params json) TO publicuser;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._obs_gettable(table_name text, output_table_name text, function_name text, params json) TO publicuser;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure(output_table_name text, params json) TO publicuser;
GRANT EXECUTE ON FUNCTION cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure(table_name text, output_table_name text, params json) TO publicuser;


@@ -3,65 +3,63 @@ SET search_path TO public,cartodb,cdb_dataservices_client;
CREATE TABLE my_table(cartodb_id int);
INSERT INTO my_table (cartodb_id) VALUES (1);
-- Mock the server functions
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
BEGIN
RETURN ('dummy_schema'::text, 'dummy_table'::text, 'dummy_server'::text);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_client.ds_return_metadata AS $$
BEGIN
RETURN (Array['total_pop'], Array['double precision']);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS RECORD AS $$
BEGIN
RETURN (23.4::double precision, 1::int);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
RETURNS boolean AS $$
BEGIN
RETURN true;
END;
$$ LANGUAGE 'plpgsql';
-- Augment a table with the total_pop column
SELECT cdb_dataservices_client._OBS_AugmentTable('my_table', 'dummy', '{"dummy":"dummy"}'::json);
_obs_augmenttable
-------------------
-- Create a sample user table
CREATE TABLE user_table (cartodb_id int, the_geom geometry);
INSERT INTO user_table(cartodb_id, the_geom) VALUES (1, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
INSERT INTO user_table(cartodb_id, the_geom) VALUES (2, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
INSERT INTO user_table(cartodb_id, the_geom) VALUES (3, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
-- Prepare a table with the total_pop column
SELECT cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure('my_table_dst', '{"dummy":"dummy"}'::json);
_dst_preparetableobs_getmeasure
---------------------------------
t
(1 row)
-- The results of the table should return the mocked value of 23.4 in the total_pop column
SELECT * FROM my_table;
cartodb_id | total_pop
------------+-----------
1 | 23.4
(1 row)
-- The table should now exist and be empty
SELECT * FROM my_table_dst;
cartodb_id | the_geom | total_pop
------------+----------+-----------
(0 rows)
-- Mock again the function for it to return a different value now
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS RECORD AS $$
BEGIN
RETURN (577777.4::double precision, 1::int);
END;
$$ LANGUAGE 'plpgsql';
-- Augment a new table with total_pop
SELECT cdb_dataservices_client._OBS_GetTable('my_table', 'my_table_new', 'dummy', '{"dummy":"dummy"}'::json);
_obs_gettable
---------------
-- Populate the table with measurement data
SELECT cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure('user_table', 'my_table_dst', '{"dummy":"dummy"}'::json);
_dst_populatetableobs_getmeasure
----------------------------------
t
(1 row)
-- Check that the table contains the new value for total_pop and not the value already existent in the table
SELECT * FROM my_table_new;
total_pop | cartodb_id
-----------+------------
577777.4 | 1
(1 row)
-- The table should now show the results
SELECT * FROM my_table_dst;
cartodb_id | the_geom | total_pop
------------+----------------------------------------------------+-----------
1 | 0101000020E6100000F74FC902E07D52C05FE24CC7654B4440 | 23.4
2 | 0101000020E6100000F74FC902E07D52C05FE24CC7654B4440 |
3 | 0101000020E6100000F74FC902E07D52C05FE24CC7654B4440 |
(3 rows)
-- Clean tables
DROP TABLE my_table;
DROP TABLE my_table_new;
DROP TABLE my_table_dst;


@@ -6,54 +6,51 @@ CREATE TABLE my_table(cartodb_id int);
INSERT INTO my_table (cartodb_id) VALUES (1);
-- Mock the server functions
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_client.ds_fdw_metadata AS $$
BEGIN
RETURN ('dummy_schema'::text, 'dummy_table'::text, 'dummy_server'::text);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_client.ds_return_metadata AS $$
BEGIN
RETURN (Array['total_pop'], Array['double precision']);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS RECORD AS $$
BEGIN
RETURN (23.4::double precision, 1::int);
END;
$$ LANGUAGE 'plpgsql';
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
RETURNS boolean AS $$
BEGIN
RETURN true;
END;
$$ LANGUAGE 'plpgsql';
-- Augment a table with the total_pop column
SELECT cdb_dataservices_client._OBS_AugmentTable('my_table', 'dummy', '{"dummy":"dummy"}'::json);
-- Create a sample user table
CREATE TABLE user_table (cartodb_id int, the_geom geometry);
INSERT INTO user_table(cartodb_id, the_geom) VALUES (1, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
INSERT INTO user_table(cartodb_id, the_geom) VALUES (2, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
INSERT INTO user_table(cartodb_id, the_geom) VALUES (3, '0101000020E6100000F74FC902E07D52C05FE24CC7654B4440');
-- The results of the table should return the mocked value of 23.4 in the total_pop column
SELECT * FROM my_table;
-- Prepare a table with the total_pop column
SELECT cdb_dataservices_client._DST_PrepareTableOBS_GetMeasure('my_table_dst', '{"dummy":"dummy"}'::json);
-- Mock again the function for it to return a different value now
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS RECORD AS $$
BEGIN
RETURN (577777.4::double precision, 1::int);
END;
$$ LANGUAGE 'plpgsql';
-- The table should now exist and be empty
SELECT * FROM my_table_dst;
-- Augment a new table with total_pop
SELECT cdb_dataservices_client._OBS_GetTable('my_table', 'my_table_new', 'dummy', '{"dummy":"dummy"}'::json);
-- Populate the table with measurement data
SELECT cdb_dataservices_client._DST_PopulateTableOBS_GetMeasure('user_table', 'my_table_dst', '{"dummy":"dummy"}'::json);
-- Check that the table contains the new value for total_pop and not the value already existent in the table
SELECT * FROM my_table_new;
-- The table should now show the results
SELECT * FROM my_table_dst;
-- Clean tables
DROP TABLE my_table;
DROP TABLE my_table_new;
DROP TABLE my_table_dst;


@@ -1,6 +1,6 @@
# Demographic Functions
The Demographic Snapshot enables you to collect demographic reports around a point location. For example, you can take the coordinates of a coffee shop and find the average population characteristics, such as total population, educational attainment, housing and income information around that location. You can use raw street addresses by combining the Demographic Snapshot with CARTO's geocoding features. If you need help creating coordinates from addresses, see the [Geocoding Functions](/carto-engine/dataservices-api/geocoding-functions/) documentation.
The Demographic Snapshot enables you to collect demographic reports around a point location. For example, you can take the coordinates of a coffee shop and find the average population characteristics, such as total population, educational attainment, housing and income information around that location. You can use raw street addresses by combining the Demographic Snapshot with CARTO's geocoding features. If you need help creating coordinates from addresses, see the [Geocoding Functions](https://carto.com/docs/carto-engine/dataservices-api/geocoding-functions/) documentation.
_**Note:** The Demographic Snapshot functions are only available for the United States._
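For example, a geocoded point can feed a Demographic Snapshot call directly. A sketch, assuming the client-side cdb_geocode_street_point(searchtext, city, state, country) signature and the OBS_GetDemographicSnapshot function from the Data Observatory documentation:

SELECT OBS_GetDemographicSnapshot(cdb_geocode_street_point('201 Moore St', 'Brooklyn', 'NY', 'US'));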


@@ -290,8 +290,7 @@ Geocodes a postal code from a specified country into an IP address, displayed as
Name | Type | Description
--- | --- | ---
`ip_address` | `text` | Postal code
`country_name` | `text` | IPv4 or IPv6 address
`ip_address` | `text` | IPv4 or IPv6 address
#### Returns


@@ -0,0 +1,69 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.16.0'" to load this file. \quit
-- Here goes your code to upgrade/downgrade
-- This is done in order to avoid an undesired dependency on the cartodb extension
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_conf_getconf(input_key text)
RETURNS JSON AS $$
SELECT VALUE FROM cartodb.cdb_conf WHERE key = input_key;
$$ LANGUAGE SQL STABLE SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
import cartodb_services
cartodb_services.init(plpy, GD)
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.tools import Logger
from cartodb_services.refactor.tools.logger import LoggerConfigBuilder
from cartodb_services.refactor.service.mapzen_geocoder_config import MapzenGeocoderConfigBuilder
from cartodb_services.refactor.core.environment import ServerEnvironmentBuilder
from cartodb_services.refactor.backend.server_config import ServerConfigBackendFactory
from cartodb_services.refactor.backend.user_config import UserConfigBackendFactory
from cartodb_services.refactor.backend.org_config import OrgConfigBackendFactory
from cartodb_services.refactor.backend.redis_metrics_connection import RedisMetricsConnectionFactory
server_config_backend = ServerConfigBackendFactory().get()
environment = ServerEnvironmentBuilder(server_config_backend).get()
user_config_backend = UserConfigBackendFactory(username, environment, server_config_backend).get()
org_config_backend = OrgConfigBackendFactory(orgname, environment, server_config_backend).get()
logger_config = LoggerConfigBuilder(environment, server_config_backend).get()
logger = Logger(logger_config)
mapzen_geocoder_config = MapzenGeocoderConfigBuilder(server_config_backend, user_config_backend, org_config_backend, username, orgname).get()
redis_metrics_connection = RedisMetricsConnectionFactory(environment, server_config_backend).get()
quota_service = QuotaService(mapzen_geocoder_config, redis_metrics_connection)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
geocoder = MapzenGeocoder(mapzen_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country:
country_iso3 = country_to_iso3(country)
coordinates = geocoder.geocode(searchtext=searchtext, city=city,
state_province=state_province,
country=country_iso3, search_type='address')
if coordinates:
quota_service.increment_success_service_use()
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])
point = plpy.execute(plan, [coordinates[0], coordinates[1]], 1)[0]
return point['st_setsrid']
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode street point using mapzen', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode street point using mapzen')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
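A usage sketch for the function above, with illustrative user and address values (in practice it is reached through cdb_geocode_street_point rather than called directly):

SELECT cdb_dataservices_server._cdb_mapzen_geocode_street_point('some_user', NULL, 'Plaza Mayor', 'Madrid', NULL, 'Spain');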


@@ -0,0 +1,54 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.15.1'" to load this file. \quit
-- Here goes your code to upgrade/downgrade
DROP FUNCTION IF EXISTS cdb_dataservices_server.cdb_conf_getconf(text);
-- Geocodes a street address given a searchtext and a state and/or country
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
import cartodb_services
cartodb_services.init(plpy, GD)
from cartodb_services.config.user import User
from cartodb_services.config.configs import ConfigsFactory
from cartodb_services.config.hires_geocoder_config import HiResGeocoderConfigFactory
from cartodb_services.request.request import RequestFactory
user = User(username, orgname)
configs = ConfigsFactory.get(user)
request = RequestFactory().create(user, configs, 'cdb_geocode_street_point')
# TODO change to hires_geocoder_config = HiResGeocoderConfigFactory.get(request)
hires_geocoder_config = HiResGeocoderConfigFactory(configs).get(user)
if hires_geocoder_config.provider == 'here':
here_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_here_geocode_street_point($1, $2, $3, $4, $5, $6) as point; ", ["text", "text", "text", "text", "text", "text"])
return plpy.execute(here_plan, [username, orgname, searchtext, city, state_province, country], 1)[0]['point']
elif hires_geocoder_config.provider == 'google':
google_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_google_geocode_street_point($1, $2, $3, $4, $5, $6) as point; ", ["text", "text", "text", "text", "text", "text"])
return plpy.execute(google_plan, [username, orgname, searchtext, city, state_province, country], 1)[0]['point']
elif hires_geocoder_config.provider == 'mapzen':
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_street_point($1, $2, $3, $4, $5, $6) as point; ", ["text", "text", "text", "text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, searchtext, city, state_province, country], 1)[0]['point']
else:
raise Exception('Requested geocoder is not available')
$$ LANGUAGE plpythonu;
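The dispatcher above picks the provider-specific implementation from hires_geocoder_config.provider; callers never select the backend themselves. A usage sketch with illustrative arguments:

SELECT cdb_dataservices_server.cdb_geocode_street_point('some_user', NULL, 'Calle 50', 'Panama City', NULL, 'Panama');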
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_here_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
if user_geocoder_config.heremaps_geocoder:
here_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_here_geocode_street_point($1, $2, $3, $4, $5, $6) as point; ", ["text", "text", "text", "text", "text", "text"])
return plpy.execute(here_plan, [username, orgname, searchtext, city, state_province, country], 1)[0]['point']
else:
raise Exception('Here geocoder is not available for your account.')
$$ LANGUAGE plpythonu;

File diff suppressed because it is too large

@@ -1,5 +1,5 @@
comment = 'CartoDB dataservices server extension'
default_version = '0.14.2'
default_version = '0.16.0'
requires = 'plpythonu, plproxy, postgis, cdb_geocoder'
superuser = true
schema = cdb_dataservices_server


@@ -0,0 +1,181 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.15.0'" to load this file. \quit
-- HERE goes your code to upgrade/downgrade
DROP FUNCTION IF EXISTS cdb_dataservices_server._get_geocoder_config(text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_server._get_geocoder_config(username text, orgname text, provider text DEFAULT NULL)
RETURNS boolean AS $$
cache_key = "user_geocoder_config_{0}".format(username)
if cache_key in GD:
return False
else:
from cartodb_services.metrics import GeocoderConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metadata_connection']
geocoder_config = GeocoderConfig(redis_conn, plpy, username, orgname, provider)
GD[cache_key] = geocoder_config
return True
$$ LANGUAGE plpythonu SECURITY DEFINER;
---- cdb_geocode_namedplace_point(city_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text)
RETURNS Geometry AS $$
try:
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3) as point;", ["text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name])[0]['point']
except BaseException as e:
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3) as point;", ["text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name])[0]['point']
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, country_name text)
RETURNS Geometry AS $$
try:
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3, NULL, $4) as point;", ["text", "text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name, country_name])[0]['point']
except BaseException as e:
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3, NULL, $4) as point;", ["text", "text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name, country_name])[0]['point']
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, admin1_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, admin1_name text, country_name text)
RETURNS Geometry AS $$
try:
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3, $4, $5) as point;", ["text", "text", "text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name, admin1_name, country_name])[0]['point']
except BaseException as e:
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3, $4, $5) as point;", ["text", "text", "text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name, admin1_name, country_name])[0]['point']
$$ LANGUAGE plpythonu;
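Each overload above first tries the Mapzen geocoder and falls back to the internal geocoder when that call raises. A usage sketch with illustrative place names:

SELECT cdb_dataservices_server.cdb_geocode_namedplace_point('some_user', NULL, 'Barcelona', 'Catalonia', 'Spain');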
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_namedplace(username text, orgname text, city_name text, admin1_name text DEFAULT NULL, country_name text DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_geocoder_config({0}, {1}, {2})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname), plpy.quote_nullable('mapzen')))
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
geocoder = MapzenGeocoder(user_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country_name:
country_iso3 = country_to_iso3(country_name)
coordinates = geocoder.geocode(searchtext=city_name, city=None,
state_province=admin1_name,
country=country_iso3, search_type='locality')
if coordinates:
quota_service.increment_success_service_use()
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])
point = plpy.execute(plan, [coordinates[0], coordinates[1]], 1)[0]
return point['st_setsrid']
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode city point using mapzen', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode city point using mapzen')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_internal_geocode_namedplace(username text, orgname text, city_name text, admin1_name text DEFAULT NULL, country_name text DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
if admin1_name and country_name:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2), trim($3)) AS mypoint", ["text", "text", "text"])
rv = plpy.execute(plan, [city_name, admin1_name, country_name], 1)
elif country_name:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2)) AS mypoint", ["text", "text"])
rv = plpy.execute(plan, [city_name, country_name], 1)
else:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1)) AS mypoint", ["text"])
rv = plpy.execute(plan, [city_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()
return result
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.tools import Logger,LoggerConfig
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
geocoder = MapzenGeocoder(user_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country:
country_iso3 = country_to_iso3(country)
coordinates = geocoder.geocode(searchtext=searchtext, city=city,
state_province=state_province,
country=country_iso3, search_type='address')
if coordinates:
quota_service.increment_success_service_use()
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])
point = plpy.execute(plan, [coordinates[0], coordinates[1]], 1)[0]
return point['st_setsrid']
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode street point using mapzen', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode street point using mapzen')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;


@@ -0,0 +1,169 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.14.2'" to load this file. \quit
-- HERE goes your code to upgrade/downgrade
DROP FUNCTION IF EXISTS cdb_dataservices_server._get_geocoder_config(text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server._cdb_mapzen_geocode_namedplace(text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server._cdb_internal_geocode_namedplace(text, text, text, text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_server._get_geocoder_config(username text, orgname text)
RETURNS boolean AS $$
cache_key = "user_geocoder_config_{0}".format(username)
if cache_key in GD:
return False
else:
from cartodb_services.metrics import GeocoderConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metadata_connection']
geocoder_config = GeocoderConfig(redis_conn, plpy, username, orgname)
GD[cache_key] = geocoder_config
return True
$$ LANGUAGE plpythonu SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1)) AS mypoint", ["text"])
rv = plpy.execute(plan, [city_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()
return result
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, country_name text)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2)) AS mypoint", ["text", "text"])
rv = plpy.execute(plan, [city_name, country_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()
return result
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, admin1_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, admin1_name text, country_name text)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2), trim($3)) AS mypoint", ["text", "text", "text"])
rv = plpy.execute(plan, [city_name, admin1_name, country_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()
return result
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.tools import Logger,LoggerConfig
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
geocoder = MapzenGeocoder(user_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country:
country_iso3 = country_to_iso3(country)
coordinates = geocoder.geocode(searchtext=searchtext, city=city,
state_province=state_province,
country=country_iso3)
if coordinates:
quota_service.increment_success_service_use()
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])
point = plpy.execute(plan, [coordinates[0], coordinates[1]], 1)[0]
return point['st_setsrid']
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode street point using mapzen', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode street point using mapzen')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;


@@ -0,0 +1,44 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.15.1'" to load this file. \quit
-- HERE goes your code to upgrade/downgrade
DROP FUNCTION IF EXISTS cdb_dataservices_server._OBS_ConnectUserTable(text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server.__OBS_ConnectUserTable(text, text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server._OBS_GetReturnMetadata(text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_server._OBS_FetchJoinFdwTableData(text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_server._OBS_DisconnectUserTable(text, text, text, text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
host_addr = plpy.execute("SELECT split_part(inet_client_addr()::text, '/', 1) as user_host")[0]['user_host']
return plpy.execute("SELECT * FROM cdb_dataservices_server.__DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {host_addr}::text, {table_name}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), schema=plpy.quote_literal(input_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name), host_addr=plpy.quote_literal(host_addr))
)[0]
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server.__DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, host_addr text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_server.ds_return_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS SETOF record AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
RETURNS boolean AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_DisconnectUserTable;
$$ LANGUAGE plproxy;


@@ -0,0 +1,44 @@
--DO NOT MODIFY THIS FILE, IT IS GENERATED AUTOMATICALLY FROM SOURCES
-- Complain if script is sourced in psql, rather than via CREATE EXTENSION
\echo Use "ALTER EXTENSION cdb_dataservices_server UPDATE TO '0.15.0'" to load this file. \quit
-- HERE goes your code to upgrade/downgrade
DROP FUNCTION IF EXISTS cdb_dataservices_server._DST_ConnectUserTable(text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server.__DST_ConnectUserTable(text, text, text, text, text, text, text);
DROP FUNCTION IF EXISTS cdb_dataservices_server._DST_GetReturnMetadata(text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_server._DST_FetchJoinFdwTableData(text, text, text, text, text, json);
DROP FUNCTION IF EXISTS cdb_dataservices_server._DST_DisconnectUserTable(text, text, text, text, text);
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
host_addr = plpy.execute("SELECT split_part(inet_client_addr()::text, '/', 1) as user_host")[0]['user_host']
return plpy.execute("SELECT * FROM cdb_dataservices_server.__OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {host_addr}::text, {table_name}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), schema=plpy.quote_literal(input_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name), host_addr=plpy.quote_literal(host_addr))
)[0]
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server.__OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, host_addr text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_server.ds_return_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS SETOF record AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
RETURNS boolean AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_DisconnectUserTable;
$$ LANGUAGE plproxy;

File diff suppressed because it is too large

@@ -2,34 +2,34 @@ CREATE TYPE cdb_dataservices_server.ds_fdw_metadata as (schemaname text, tabname
CREATE TYPE cdb_dataservices_server.ds_return_metadata as (colnames text[], coltypes text[]);
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
host_addr = plpy.execute("SELECT split_part(inet_client_addr()::text, '/', 1) as user_host")[0]['user_host']
return plpy.execute("SELECT * FROM cdb_dataservices_server.__OBS_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {host_addr}::text, {table_name}::text)"
return plpy.execute("SELECT * FROM cdb_dataservices_server.__DST_ConnectUserTable({username}::text, {orgname}::text, {user_db_role}::text, {schema}::text, {dbname}::text, {host_addr}::text, {table_name}::text)"
.format(username=plpy.quote_nullable(username), orgname=plpy.quote_nullable(orgname), user_db_role=plpy.quote_literal(user_db_role), schema=plpy.quote_literal(input_schema), dbname=plpy.quote_literal(dbname), table_name=plpy.quote_literal(table_name), host_addr=plpy.quote_literal(host_addr))
)[0]
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server.__OBS_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, host_addr text, table_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.__DST_ConnectUserTable(username text, orgname text, user_db_role text, input_schema text, dbname text, host_addr text, table_name text)
RETURNS cdb_dataservices_server.ds_fdw_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_ConnectUserTable;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_GetReturnMetadata(username text, orgname text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_GetReturnMetadata(username text, orgname text, function_name text, params json)
RETURNS cdb_dataservices_server.ds_return_metadata AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_GetReturnMetadata;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_FetchJoinFdwTableData(username text, orgname text, table_schema text, table_name text, function_name text, params json)
RETURNS SETOF record AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_FetchJoinFdwTableData;
$$ LANGUAGE plproxy;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._OBS_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._DST_DisconnectUserTable(username text, orgname text, table_schema text, table_name text, servername text)
RETURNS boolean AS $$
CONNECT cdb_dataservices_server._obs_server_conn_str(username, orgname);
TARGET cdb_observatory._OBS_DisconnectUserTable;

View File

@@ -10,7 +10,13 @@ RETURNS boolean AS $$
return True
$$ LANGUAGE plpythonu SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._get_geocoder_config(username text, orgname text)
-- This is done in order to avoid an undesired dependency on the cartodb extension
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_conf_getconf(input_key text)
RETURNS JSON AS $$
SELECT VALUE FROM cartodb.cdb_conf WHERE key = input_key;
$$ LANGUAGE SQL STABLE SECURITY DEFINER;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._get_geocoder_config(username text, orgname text, provider text DEFAULT NULL)
RETURNS boolean AS $$
cache_key = "user_geocoder_config_{0}".format(username)
if cache_key in GD:
@@ -19,7 +25,7 @@ RETURNS boolean AS $$
from cartodb_services.metrics import GeocoderConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metadata_connection']
geocoder_config = GeocoderConfig(redis_conn, plpy, username, orgname)
geocoder_config = GeocoderConfig(redis_conn, plpy, username, orgname, provider)
GD[cache_key] = geocoder_config
return True
$$ LANGUAGE plpythonu SECURITY DEFINER;

View File

@@ -137,29 +137,44 @@ $$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_street_point(username TEXT, orgname TEXT, searchtext TEXT, city TEXT DEFAULT NULL, state_province TEXT DEFAULT NULL, country TEXT DEFAULT NULL)
RETURNS Geometry AS $$
import cartodb_services
cartodb_services.init(plpy, GD)
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.tools import Logger,LoggerConfig
from cartodb_services.tools import Logger
from cartodb_services.refactor.tools.logger import LoggerConfigBuilder
from cartodb_services.refactor.service.mapzen_geocoder_config import MapzenGeocoderConfigBuilder
from cartodb_services.refactor.core.environment import ServerEnvironmentBuilder
from cartodb_services.refactor.backend.server_config import ServerConfigBackendFactory
from cartodb_services.refactor.backend.user_config import UserConfigBackendFactory
from cartodb_services.refactor.backend.org_config import OrgConfigBackendFactory
from cartodb_services.refactor.backend.redis_metrics_connection import RedisMetricsConnectionFactory
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
server_config_backend = ServerConfigBackendFactory().get()
environment = ServerEnvironmentBuilder(server_config_backend).get()
user_config_backend = UserConfigBackendFactory(username, environment, server_config_backend).get()
org_config_backend = OrgConfigBackendFactory(orgname, environment, server_config_backend).get()
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger_config = LoggerConfigBuilder(environment, server_config_backend).get()
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
mapzen_geocoder_config = MapzenGeocoderConfigBuilder(server_config_backend, user_config_backend, org_config_backend, username, orgname).get()
redis_metrics_connection = RedisMetricsConnectionFactory(environment, server_config_backend).get()
quota_service = QuotaService(mapzen_geocoder_config, redis_metrics_connection)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
geocoder = MapzenGeocoder(user_geocoder_config.mapzen_api_key, logger)
geocoder = MapzenGeocoder(mapzen_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country:
country_iso3 = country_to_iso3(country)
coordinates = geocoder.geocode(searchtext=searchtext, city=city,
state_province=state_province,
country=country_iso3)
country=country_iso3, search_type='address')
if coordinates:
quota_service.increment_success_service_use()
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])

View File

@@ -1,76 +1,81 @@
---- cdb_geocode_namedplace_point(city_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1)) AS mypoint", ["text"])
rv = plpy.execute(plan, [city_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()
return result
else:
quota_service.increment_empty_service_use()
return None
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3) as point;", ["text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name])[0]['point']
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
finally:
quota_service.increment_total_service_use()
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3) as point;", ["text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name])[0]['point']
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, country_name text)
RETURNS Geometry AS $$
try:
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3, NULL, $4) as point;", ["text", "text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name, country_name])[0]['point']
except BaseException as e:
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3, NULL, $4) as point;", ["text", "text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name, country_name])[0]['point']
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, admin1_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, admin1_name text, country_name text)
RETURNS Geometry AS $$
try:
mapzen_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_mapzen_geocode_namedplace($1, $2, $3, $4, $5) as point;", ["text", "text", "text", "text", "text"])
return plpy.execute(mapzen_plan, [username, orgname, city_name, admin1_name, country_name])[0]['point']
except BaseException as e:
internal_plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_internal_geocode_namedplace($1, $2, $3, $4, $5) as point;", ["text", "text", "text", "text", "text"])
return plpy.execute(internal_plan, [username, orgname, city_name, admin1_name, country_name])[0]['point']
$$ LANGUAGE plpythonu;
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_mapzen_geocode_namedplace(username text, orgname text, city_name text, admin1_name text DEFAULT NULL, country_name text DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.types import country_to_iso3
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
from cartodb_services.tools import Logger,LoggerConfig
plpy.execute("SELECT cdb_dataservices_server._connect_to_redis('{0}')".format(username))
redis_conn = GD["redis_connection_{0}".format(username)]['redis_metrics_connection']
plpy.execute("SELECT cdb_dataservices_server._get_internal_geocoder_config({0}, {1})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname)))
user_geocoder_config = GD["user_internal_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_geocoder_config({0}, {1}, {2})".format(plpy.quote_nullable(username), plpy.quote_nullable(orgname), plpy.quote_nullable('mapzen')))
user_geocoder_config = GD["user_geocoder_config_{0}".format(username)]
plpy.execute("SELECT cdb_dataservices_server._get_logger_config()")
logger_config = GD["logger_config"]
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
if not quota_service.check_user_quota():
raise Exception('You have reached the limit of your quota')
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2)) AS mypoint", ["text", "text"])
rv = plpy.execute(plan, [city_name, country_name], 1)
result = rv[0]["mypoint"]
if result:
geocoder = MapzenGeocoder(user_geocoder_config.mapzen_api_key, logger)
country_iso3 = None
if country_name:
country_iso3 = country_to_iso3(country_name)
coordinates = geocoder.geocode(searchtext=city_name, city=None,
state_province=admin1_name,
country=country_iso3, search_type='locality')
if coordinates:
quota_service.increment_success_service_use()
return result
plan = plpy.prepare("SELECT ST_SetSRID(ST_MakePoint($1, $2), 4326); ", ["double precision", "double precision"])
point = plpy.execute(plan, [coordinates[0], coordinates[1]], 1)[0]
return point['st_setsrid']
else:
quota_service.increment_empty_service_use()
return None
except BaseException as e:
import sys
quota_service.increment_failed_service_use()
logger.error('Error trying to geocode namedplace point', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode namedplace point')
logger.error('Error trying to geocode city point using mapzen', sys.exc_info(), data={"username": username, "orgname": orgname})
raise Exception('Error trying to geocode city point using mapzen')
finally:
quota_service.increment_total_service_use()
$$ LANGUAGE plpythonu;
---- cdb_geocode_namedplace_point(city_name text, admin1_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_geocode_namedplace_point(username text, orgname text, city_name text, admin1_name text, country_name text)
CREATE OR REPLACE FUNCTION cdb_dataservices_server._cdb_internal_geocode_namedplace(username text, orgname text, city_name text, admin1_name text DEFAULT NULL, country_name text DEFAULT NULL)
RETURNS Geometry AS $$
from cartodb_services.metrics import QuotaService
from cartodb_services.metrics import InternalGeocoderConfig
@@ -86,8 +91,15 @@ RETURNS Geometry AS $$
logger = Logger(logger_config)
quota_service = QuotaService(user_geocoder_config, redis_conn)
try:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2), trim($3)) AS mypoint", ["text", "text", "text"])
rv = plpy.execute(plan, [city_name, admin1_name, country_name], 1)
if admin1_name and country_name:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2), trim($3)) AS mypoint", ["text", "text", "text"])
rv = plpy.execute(plan, [city_name, admin1_name, country_name], 1)
elif country_name:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1), trim($2)) AS mypoint", ["text", "text"])
rv = plpy.execute(plan, [city_name, country_name], 1)
else:
plan = plpy.prepare("SELECT cdb_dataservices_server._cdb_geocode_namedplace_point(trim($1)) AS mypoint", ["text"])
rv = plpy.execute(plan, [city_name], 1)
result = rv[0]["mypoint"]
if result:
quota_service.increment_success_service_use()

View File

@@ -4,7 +4,6 @@ CREATE EXTENSION plpythonu;
CREATE EXTENSION plproxy;
CREATE EXTENSION cartodb;
CREATE EXTENSION cdb_geocoder;
CREATE EXTENSION observatory VERSION 'dev';
-- Install the extension
CREATE EXTENSION cdb_dataservices_server;
-- Mock the redis server connection to point to this very test db

View File

@@ -2,7 +2,7 @@ SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_connectusertable'
AND proname = '_dst_connectusertable'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text, text');
exists
--------
@@ -13,7 +13,7 @@ SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_getreturnmetadata'
AND proname = '_dst_getreturnmetadata'
AND oidvectortypes(p.proargtypes) = 'text, text, text, json');
exists
--------
@@ -24,7 +24,7 @@ SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_fetchjoinfdwtabledata'
AND proname = '_dst_fetchjoinfdwtabledata'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text, json');
exists
--------
@@ -35,7 +35,7 @@ SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_disconnectusertable'
AND proname = '_dst_disconnectusertable'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text');
exists
--------

View File

@@ -27,7 +27,7 @@ INSERT INTO global_cities_alternates_limited (geoname_id, name, preferred, lower
'POINT(0.6983 39.26787)',4326)
);
-- Insert dummy data into country decoder table
INSERT INTO country_decoder (synonyms, iso2) VALUES (Array['spain'], 'ES');
INSERT INTO country_decoder (synonyms, iso2) VALUES (Array['spain', 'Spain'], 'ES');
-- Insert dummy data into admin1 decoder table
INSERT INTO admin1_decoder (admin1, synonyms, iso2) VALUES ('Valencia', Array['valencia', 'Valencia'], 'ES');
-- This should return the point inserted above

View File

@@ -4,7 +4,6 @@ CREATE EXTENSION plpythonu;
CREATE EXTENSION plproxy;
CREATE EXTENSION cartodb;
CREATE EXTENSION cdb_geocoder;
CREATE EXTENSION observatory VERSION 'dev';
-- Install the extension
CREATE EXTENSION cdb_dataservices_server;

View File

@@ -2,27 +2,27 @@ SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_connectusertable'
AND proname = '_dst_connectusertable'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text, text');
SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_getreturnmetadata'
AND proname = '_dst_getreturnmetadata'
AND oidvectortypes(p.proargtypes) = 'text, text, text, json');
SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_fetchjoinfdwtabledata'
AND proname = '_dst_fetchjoinfdwtabledata'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text, json');
SELECT exists(SELECT *
FROM pg_proc p
INNER JOIN pg_namespace ns ON (p.pronamespace = ns.oid)
WHERE ns.nspname = 'cdb_dataservices_server'
AND proname = '_obs_disconnectusertable'
AND proname = '_dst_disconnectusertable'
AND oidvectortypes(p.proargtypes) = 'text, text, text, text, text');

View File

@@ -15,7 +15,7 @@ INSERT INTO global_cities_alternates_limited (geoname_id, name, preferred, lower
);
-- Insert dummy data into country decoder table
INSERT INTO country_decoder (synonyms, iso2) VALUES (Array['spain'], 'ES');
INSERT INTO country_decoder (synonyms, iso2) VALUES (Array['spain', 'Spain'], 'ES');
-- Insert dummy data into admin1 decoder table
INSERT INTO admin1_decoder (admin1, synonyms, iso2) VALUES ('Valencia', Array['valencia', 'Valencia'], 'ES');

View File

@@ -1,4 +1,4 @@
# CartoDB dataservices API python module
# CARTO dataservices API python module
This directory contains the python library used by the server side of CARTO LDS (Location Data Services).

View File

@@ -0,0 +1,35 @@
# NOTE: This init function must be called from plpythonu entry points to
# initialize the cartodb_services module properly. E.g.:
#
# CREATE OR REPLACE FUNCTION cdb_dataservices_server.cdb_isochrone(...)
# RETURNS SETOF cdb_dataservices_server.isoline AS $$
#
# import cartodb_services
# cartodb_services.init(plpy, GD)
#
# # rest of the code here
# cartodb_services.GD[key] = val
# cartodb_services.plpy.execute('SELECT * FROM ...')
#
# $$ LANGUAGE plpythonu;
plpy = None
GD = None
def init(_plpy, _GD):
global plpy
global GD
if plpy is None:
plpy = _plpy
if GD is None:
GD = _GD
def _reset():
# NOTE: just for testing
global plpy
global GD
plpy = None
GD = None

View File

@@ -14,6 +14,8 @@ class HereMapsGeocoder:
STAGING_GEOCODE_JSON_URL = 'https://geocoder.cit.api.here.com/6.2/geocode.json'
DEFAULT_MAXRESULTS = 1
DEFAULT_GEN = 9
READ_TIMEOUT = 60
CONNECT_TIMEOUT = 10
ADDRESS_PARAMS = [
'city',
@@ -85,7 +87,8 @@ class HereMapsGeocoder:
'gen': self.gen
}
request_params.update(params)
response = requests.get(self.host, params=request_params)
response = requests.get(self.host, params=request_params,
timeout=(self.CONNECT_TIMEOUT, self.READ_TIMEOUT))
if response.status_code == requests.codes.ok:
return json.loads(response.text)
elif response.status_code == requests.codes.bad_request:

View File

@@ -10,6 +10,8 @@ class HereMapsRoutingIsoline:
PRODUCTION_ROUTING_BASE_URL = 'https://isoline.route.api.here.com'
STAGING_ROUTING_BASE_URL = 'https://isoline.route.cit.api.here.com'
ISOLINE_PATH = '/routing/7.2/calculateisoline.json'
READ_TIMEOUT = 60
CONNECT_TIMEOUT = 10
ACCEPTED_MODES = {
"walk": "pedestrian",
@@ -50,7 +52,8 @@ class HereMapsRoutingIsoline:
data_range,
range_type,
parsed_options)
response = requests.get(self._url, params=request_params)
response = requests.get(self._url, params=request_params,
timeout=(self.CONNECT_TIMEOUT, self.READ_TIMEOUT))
if response.status_code == requests.codes.ok:
return self.__parse_isolines_response(response.text)
elif response.status_code == requests.codes.bad_request:

View File

@@ -19,3 +19,15 @@ class MalformedResult(Exception):
class TimeoutException(Exception):
def __str__(self):
return repr('Timeout requesting to mapzen server')
class ServiceException(Exception):
def __init__(self, message, response):
self.message = message
self.response = response
def response(self):
return self.response
def __str__(self):
return self.message

View File

@@ -2,7 +2,7 @@ import requests
import json
import re
from exceptions import WrongParams, MalformedResult
from exceptions import WrongParams, MalformedResult, ServiceException
from qps import qps_retry
from cartodb_services.tools import Coordinate, PolyLine
@@ -11,19 +11,23 @@ class MapzenGeocoder:
'A Mapzen Geocoder wrapper for python'
BASE_URL = 'https://search.mapzen.com/v1/search'
READ_TIMEOUT = 60
CONNECT_TIMEOUT = 10
def __init__(self, app_key, logger, base_url=BASE_URL):
self._app_key = app_key
self._url = base_url
self._logger = logger
@qps_retry
def geocode(self, searchtext, city=None, state_province=None, country=None):
@qps_retry(qps=20)
def geocode(self, searchtext, city=None, state_province=None,
country=None, search_type=None):
request_params = self._build_requests_parameters(searchtext, city,
state_province,
country)
country, search_type)
try:
response = requests.get(self._url, params=request_params)
response = requests.get(self._url, params=request_params,
timeout=(self.CONNECT_TIMEOUT, self.READ_TIMEOUT))
if response.status_code == requests.codes.ok:
return self.__parse_response(response.text)
elif response.status_code == requests.codes.bad_request:
@@ -31,30 +35,38 @@ class MapzenGeocoder:
else:
self._logger.error('Error trying to geocode using mapzen',
data={"response_status": response.status_code,
"response_reason": response.reason,
"response_content": response.text,
"reponse_url": response.url,
"response_headers": response.headers,
"searchtext": searchtext,
"city": city, "country": country,
"state_province": state_province })
raise Exception('Error trying to geocode {0} using mapzen'.format(searchtext))
"response_reason": response.reason,
"response_content": response.text,
"reponse_url": response.url,
"response_headers": response.headers,
"searchtext": searchtext,
"city": city, "country": country,
"state_province": state_province})
raise ServiceException('Error trying to geocode {0} using mapzen'.format(searchtext),
response)
except requests.Timeout as te:
# In case of timeout we want to stop the job because the server
# could be down
self._logger.error('Timeout connecting to Mapzen geocoding server')
raise ServiceException('Error trying to geocode {0} using mapzen'.format(searchtext),
None)
except requests.ConnectionError as e:
# Don't raise the exception to continue with the geocoding job
self._logger.error('Error connecting to Mapzen geocoding server',
exception=e)
return []
def _build_requests_parameters(self, searchtext, city=None,
state_province=None, country=None):
state_province=None, country=None,
search_type=None):
request_params = {}
search_string = self._build_search_text(searchtext.strip(),
city,
state_province)
request_params['text'] = search_string
request_params['layers'] = 'address'
request_params['api_key'] = self._app_key
if search_type:
request_params['layers'] = search_type
if country:
request_params['boundary.country'] = country
return request_params
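# --- Illustrative usage sketch (not part of the diff): how search_type maps to
# the Mapzen 'layers' request parameter. The API key is a placeholder, and no
# logger or network call is needed just to inspect the built parameters.
geocoder = MapzenGeocoder('search-xxxx', logger=None)
params = geocoder._build_requests_parameters('Plaza Mayor', city='Madrid',
                                             country='ESP', search_type='locality')
assert params['layers'] == 'locality'
assert params['boundary.country'] == 'ESP'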

View File

@@ -86,7 +86,7 @@ class MapzenIsolines:
def calculate_isoline(self, origin, costing_model, isorange, upper_rmax, cost_variable, unit_factor=1.0):
# NOTE: not for production
self._logger.debug('Calculate isoline', data={"origin": origin, "costing_model": costing_model, "isorange": isorange})
# self._logger.debug('Calculate isoline', data={"origin": origin, "costing_model": costing_model, "isorange": isorange})
# Formally, a solution is an array of {angle, radius, lat, lon, cost} with cardinality NUMBER_OF_ANGLES
# we're looking for a solution in which abs(cost - isorange) / isorange <= TOLERANCE
@@ -105,14 +105,16 @@ class MapzenIsolines:
response = self._matrix_client.one_to_many([origin] + location_estimates, costing_model)
costs = [None] * self.NUMBER_OF_ANGLES
if not response:
# In case the matrix client doesn't return any data
break
for idx, c in enumerate(response['one_to_many'][0][1:]):
if c[cost_variable]:
costs[idx] = c[cost_variable]*unit_factor
else:
costs[idx] = isorange
# self._logger.debug('i = %d, costs = %s' % (i, costs))
errors = [(cost - isorange) / float(isorange) for cost in costs]
max_abs_error = max([abs(e) for e in errors])
if max_abs_error <= self.TOLERANCE:

View File

@@ -1,6 +1,7 @@
import requests
import json
from qps import qps_retry
from exceptions import ServiceException
class MatrixClient:
@@ -18,6 +19,8 @@ class MatrixClient:
"""
ONE_TO_MANY_URL = 'https://matrix.mapzen.com/one_to_many'
READ_TIMEOUT = 60
CONNECT_TIMEOUT = 10
def __init__(self, matrix_key, logger):
self._matrix_key = matrix_key
@@ -40,9 +43,10 @@ class MatrixClient:
'costing': costing,
'api_key': self._matrix_key
}
response = requests.get(self.ONE_TO_MANY_URL, params=request_params)
response = requests.get(self.ONE_TO_MANY_URL, params=request_params,
timeout=(self.CONNECT_TIMEOUT, self.READ_TIMEOUT))
if not requests.codes.ok:
if response.status_code != requests.codes.ok:
self._logger.error('Error trying to get matrix distance from mapzen',
data={"response_status": response.status_code,
"response_reason": response.reason,
@@ -51,6 +55,22 @@ class MatrixClient:
"response_headers": response.headers,
"locations": locations,
"costing": costing})
raise Exception('Error trying to get matrix distance from mapzen')
# In case of a 4xx error we return empty data because the error comes from
# the info provided by the user and we don't want to stop the
# isolines generation
if response.status_code == requests.codes.bad_request:
return {}
elif response.status_code == 504:
# Due to some unsolved problems in the Mapzen Matrix API we're
# randomly getting 504s, probably timeouts. To avoid raising an
# exception in all the jobs, for now we're going to return
# empty data in that case
return {}
else:
raise ServiceException("Error trying to get matrix distance from mapzen", response)
return response.json()
# the response can come back with an empty or invalid JSON body
try:
return response.json()
except:
return {}

View File

@@ -4,18 +4,38 @@ from datetime import datetime
from exceptions import TimeoutException
DEFAULT_RETRY_TIMEOUT = 60
DEFAULT_QUERIES_PER_SECOND = 10
def qps_retry(f):
def wrapped_f(*args, **kw):
return QPSService().call(f, *args, **kw)
return wrapped_f
def qps_retry(original_function=None,**options):
""" Query Per Second retry decorator
The intention of this decorator is to retry requests against third
party services that have QPS restrictions.
Parameters:
- timeout: Maximum number of seconds to retry
- qps: Allowed queries per second. This parameter is used to
calculate the next time to retry the request
"""
if original_function is not None:
def wrapped_function(*args, **kwargs):
if 'timeout' in options:
timeout = options['timeout']
else:
timeout = DEFAULT_RETRY_TIMEOUT
if 'qps' in options:
qps = options['qps']
else:
qps = DEFAULT_QUERIES_PER_SECOND
return QPSService(retry_timeout=timeout, queries_per_second=qps).call(original_function, *args, **kwargs)
return wrapped_function
else:
def partial_wrapper(func):
return qps_retry(func, **options)
return partial_wrapper
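# --- Illustrative usage sketch (not part of the diff): the two supported
# decorator forms. Function names and bodies are placeholders; a real caller
# would issue the third-party request inside them.
@qps_retry
def default_paced_call():
    # Retried on HTTP 429, paced at the default 10 qps, up to the default
    # retry timeout.
    return 'ok'
@qps_retry(qps=20, timeout=30)
def tuned_call():
    # Same behaviour, but paced at 20 qps and giving up after 30 seconds.
    return 'ok'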
class QPSService:
def __init__(self, queries_per_second=10,
retry_timeout=DEFAULT_RETRY_TIMEOUT):
def __init__(self, queries_per_second, retry_timeout):
self._queries_per_second = queries_per_second
self._retry_timeout = retry_timeout
@@ -27,7 +47,7 @@ class QPSService:
return fn(*args, **kwargs)
except Exception as e:
response = getattr(e, 'response', None)
if response and (response.status_code == 429):
if response is not None and (response.status_code == 429):
self.retry(start_time, attempt_number)
else:
raise e
@@ -35,7 +55,7 @@ class QPSService:
def retry(self, first_request_time, retry_count):
elapsed = datetime.now() - first_request_time
if elapsed.seconds > self._retry_timeout:
if elapsed.microseconds > (self._retry_timeout * 1000.0):
raise TimeoutException()
# inverse qps * (1.5 ^ i) is an increased sleep time of 1.5x per

View File

@@ -2,7 +2,7 @@ import requests
import json
import re
from exceptions import WrongParams, MalformedResult
from exceptions import WrongParams, MalformedResult, ServiceException
from qps import qps_retry
from cartodb_services.tools import Coordinate, PolyLine
@@ -11,6 +11,8 @@ class MapzenRouting:
'A Mapzen Routing wrapper for python'
PRODUCTION_ROUTING_BASE_URL = 'https://valhalla.mapzen.com/route'
READ_TIMEOUT = 60
CONNECT_TIMEOUT = 10
ACCEPTED_MODES = {
"walk": "pedestrian",
@@ -43,7 +45,8 @@ class MapzenRouting:
mode_param,
units)
request_params = self.__parse_request_parameters(json_request_params)
response = requests.get(self._url, params=request_params)
response = requests.get(self._url, params=request_params,
timeout=(self.CONNECT_TIMEOUT, self.READ_TIMEOUT))
if response.status_code == requests.codes.ok:
return self.__parse_routing_response(response.text)
elif response.status_code == requests.codes.bad_request:
@@ -57,7 +60,7 @@ class MapzenRouting:
"response_headers": response.headers,
"waypoints": waypoints, "mode": mode,
"options": options})
raise Exception('Error trying to calculate route using Mapzen')
raise ServiceException('Error trying to calculate route using Mapzen', response)
def __parse_options(self, options):
return dict(option.split('=') for option in options)

View File

@@ -116,6 +116,8 @@ class RoutingConfig(ServiceConfig):
ROUTING_PROVIDER_KEY = 'routing_provider'
MAPZEN_PROVIDER = 'mapzen'
DEFAULT_PROVIDER = 'mapzen'
QUOTA_KEY = 'mapzen_routing_quota'
SOFT_LIMIT_KEY = 'soft_mapzen_routing_limit'
def __init__(self, redis_connection, db_conn, username, orgname=None):
super(RoutingConfig, self).__init__(redis_connection, db_conn,
@@ -124,7 +126,8 @@ class RoutingConfig(ServiceConfig):
if not self._routing_provider:
self._routing_provider = self.DEFAULT_PROVIDER
self._mapzen_api_key = self._db_config.mapzen_routing_api_key
self._monthly_quota = self._db_config.mapzen_routing_monthly_quota
self._set_monthly_quota()
self._set_soft_limit()
self._period_end_date = date_parse(self._redis_config[self.PERIOD_END_DATE])
@property
@@ -144,6 +147,28 @@ class RoutingConfig(ServiceConfig):
def period_end_date(self):
return self._period_end_date
@property
def soft_limit(self):
return self._soft_limit
def _set_monthly_quota(self):
self._monthly_quota = self._get_effective_monthly_quota()
def _get_effective_monthly_quota(self):
quota_from_redis = self._redis_config.get(self.QUOTA_KEY)
if quota_from_redis and quota_from_redis <> '':
return int(quota_from_redis)
else:
return self._db_config.mapzen_routing_monthly_quota
def _set_soft_limit(self):
if self.SOFT_LIMIT_KEY in self._redis_config and self._redis_config[self.SOFT_LIMIT_KEY].lower() == 'true':
self._soft_limit = True
else:
self._soft_limit = False
class IsolinesRoutingConfig(ServiceConfig):
@@ -286,11 +311,11 @@ class GeocoderConfig(ServiceConfig):
PERIOD_END_DATE = 'period_end_date'
DEFAULT_PROVIDER = 'mapzen'
def __init__(self, redis_connection, db_conn, username, orgname=None):
def __init__(self, redis_connection, db_conn, username, orgname=None, forced_provider=None):
super(GeocoderConfig, self).__init__(redis_connection, db_conn,
username, orgname)
filtered_config = {key: self._redis_config[key] for key in self.GEOCODER_CONFIG_KEYS if key in self._redis_config.keys()}
self.__parse_config(filtered_config, self._db_config)
self.__parse_config(filtered_config, self._db_config, forced_provider)
self.__check_config(filtered_config)
def __check_config(self, filtered_config):
@@ -307,9 +332,12 @@ class GeocoderConfig(ServiceConfig):
return True
def __parse_config(self, filtered_config, db_config):
self._geocoder_provider = filtered_config[self.GEOCODER_PROVIDER].lower()
if not self._geocoder_provider:
def __parse_config(self, filtered_config, db_config, forced_provider):
if forced_provider:
self._geocoder_provider = forced_provider
elif filtered_config[self.GEOCODER_PROVIDER].lower():
self._geocoder_provider = filtered_config[self.GEOCODER_PROVIDER].lower()
else:
self._geocoder_provider = self.DEFAULT_PROVIDER
self._geocoding_quota = float(filtered_config[self.QUOTA_KEY])
self._period_end_date = date_parse(filtered_config[self.PERIOD_END_DATE])
@@ -544,6 +572,7 @@ class ServicesRedisConfig:
GOOGLE_GEOCODER_CLIENT_ID = 'google_maps_client_id'
QUOTA_KEY = 'geocoding_quota'
ISOLINES_QUOTA_KEY = 'here_isolines_quota'
ROUTING_QUOTA_KEY = 'mapzen_routing_quota'
OBS_SNAPSHOT_QUOTA_KEY = 'obs_snapshot_quota'
OBS_GENERAL_QUOTA_KEY = 'obs_general_quota'
PERIOD_END_DATE = 'period_end_date'
@@ -582,8 +611,12 @@ class ServicesRedisConfig:
if not org_config:
raise ConfigException("""There is no organization config available. Please check your configuration.'""")
else:
user_config[self.QUOTA_KEY] = org_config[self.QUOTA_KEY]
user_config[self.ISOLINES_QUOTA_KEY] = org_config[self.ISOLINES_QUOTA_KEY]
if self.QUOTA_KEY in org_config:
user_config[self.QUOTA_KEY] = org_config[self.QUOTA_KEY]
if self.ISOLINES_QUOTA_KEY in org_config:
user_config[self.ISOLINES_QUOTA_KEY] = org_config[self.ISOLINES_QUOTA_KEY]
if self.ROUTING_QUOTA_KEY in org_config:
user_config[self.ROUTING_QUOTA_KEY] = org_config[self.ROUTING_QUOTA_KEY]
if self.OBS_SNAPSHOT_QUOTA_KEY in org_config:
user_config[self.OBS_SNAPSHOT_QUOTA_KEY] = org_config[self.OBS_SNAPSHOT_QUOTA_KEY]
if self.OBS_GENERAL_QUOTA_KEY in org_config:

View File

@@ -93,7 +93,7 @@ class QuotaChecker:
current_used = self._user_service.used_quota(service_type, today)
soft_geocoding_limit = self._user_service_config.soft_geocoding_limit
if soft_geocoding_limit or (user_quota > 0 and current_used <= user_quota):
if soft_geocoding_limit or (user_quota > 0 and current_used < user_quota):
return True
else:
return False
@@ -105,7 +105,7 @@ class QuotaChecker:
current_used = self._user_service.used_quota(service_type, today)
soft_isolines_limit = self._user_service_config.soft_isolines_limit
if soft_isolines_limit or (user_quota > 0 and current_used <= user_quota):
if soft_isolines_limit or (user_quota > 0 and current_used < user_quota):
return True
else:
return False
@@ -115,8 +115,9 @@ class QuotaChecker:
today = date.today()
service_type = self._user_service_config.service_type
current_used = self._user_service.used_quota(service_type, today)
soft_limit = self._user_service_config.soft_limit
if (user_quota > 0 and current_used <= user_quota):
if soft_limit or (user_quota > 0 and current_used < user_quota):
return True
else:
return False
@@ -128,7 +129,7 @@ class QuotaChecker:
service_type = self._user_service_config.service_type
current_used = self._user_service.used_quota(service_type, today)
if soft_limit or (user_quota > 0 and current_used <= user_quota):
if soft_limit or (user_quota > 0 and current_used < user_quota):
return True
else:
return False

View File

@@ -8,6 +8,7 @@ class UserMetricsService:
SERVICE_GEOCODER_NOKIA = 'geocoder_here'
SERVICE_GEOCODER_CACHE = 'geocoder_cache'
SERVICE_HERE_ISOLINES = 'here_isolines'
SERVICE_MAPZEN_ROUTING = 'routing_mapzen'
DAY_OF_MONTH_ZERO_PADDED = '%d'
def __init__(self, user_geocoder_config, redis_connection):
@@ -19,6 +20,8 @@ class UserMetricsService:
def used_quota(self, service_type, date):
if service_type == self.SERVICE_HERE_ISOLINES:
return self.__used_isolines_quota(service_type, date)
elif service_type == self.SERVICE_MAPZEN_ROUTING:
return self.__used_routing_quota(service_type, date)
else:
return self.__used_geocoding_quota(service_type, date)

View File

@@ -0,0 +1,24 @@
from cartodb_services.refactor.storage.redis_connection_config import RedisMetadataConnectionConfigBuilder
from cartodb_services.refactor.storage.redis_connection import RedisConnectionBuilder
from cartodb_services.refactor.storage.redis_config import RedisOrgConfigStorageBuilder
class OrgConfigBackendFactory(object):
"""
This class abstracts the creation of an org configuration backend. It will return
an implementation of the ConfigBackendInterface appropriate to the org, depending
on the environment.
"""
def __init__(self, orgname, environment, server_config_backend):
self._orgname = orgname
self._environment = environment
self._server_config_backend = server_config_backend
def get(self):
if self._environment.is_onpremise:
org_config_backend = self._server_config_backend
else:
redis_metadata_connection_config = RedisMetadataConnectionConfigBuilder(self._server_config_backend).get()
redis_metadata_connection = RedisConnectionBuilder(redis_metadata_connection_config).get()
org_config_backend = RedisOrgConfigStorageBuilder(redis_metadata_connection, self._orgname).get()
return org_config_backend

View File

@@ -0,0 +1,17 @@
from cartodb_services.refactor.tools.redis_mock import RedisConnectionMock
from cartodb_services.refactor.storage.redis_connection_config import RedisMetricsConnectionConfigBuilder
from cartodb_services.refactor.storage.redis_connection import RedisConnectionBuilder
class RedisMetricsConnectionFactory(object):
def __init__(self, environment, server_config_storage):
self._environment = environment
self._server_config_storage = server_config_storage
def get(self):
if self._environment.is_onpremise:
redis_metrics_connection = RedisConnectionMock()
else:
redis_metrics_connection_config = RedisMetricsConnectionConfigBuilder(self._server_config_storage).get()
redis_metrics_connection = RedisConnectionBuilder(redis_metrics_connection_config).get()
return redis_metrics_connection
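# --- Illustrative sketch (not part of the diff): in the 'onpremise' environment
# the factory hands back the dummy RedisConnectionMock, so metrics calls become
# no-ops. The key and values below are placeholders.
from cartodb_services.refactor.core.environment import ServerEnvironment
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
metrics_conn = RedisMetricsConnectionFactory(ServerEnvironment('onpremise'),
                                             InMemoryConfigStorage({})).get()
metrics_conn.zincrby('user:my_user:geocoder_mapzen:success_responses:201610', 27, 1)  # no-op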

View File

@@ -0,0 +1,13 @@
from cartodb_services.refactor.storage.server_config import InDbServerConfigStorage
class ServerConfigBackendFactory(object):
"""
This class creates a backend to retrieve server configurations (implementing the ConfigBackendInterface).
At this moment it always returns an InDbServerConfigStorage, but nothing prevents changing the
implementation to something that reads from a file, memory or whatever. It is mostly there to keep
the layers separated.
"""
def get(self):
return InDbServerConfigStorage()

View File

@@ -0,0 +1,24 @@
from cartodb_services.refactor.storage.redis_connection_config import RedisMetadataConnectionConfigBuilder
from cartodb_services.refactor.storage.redis_connection import RedisConnectionBuilder
from cartodb_services.refactor.storage.redis_config import RedisUserConfigStorageBuilder
class UserConfigBackendFactory(object):
"""
This class abstracts the creation of a user configuration backend. It will return
an implementation of the ConfigBackendInterface appropriate to the user, depending
on the environment.
"""
def __init__(self, username, environment, server_config_backend):
self._username = username
self._environment = environment
self._server_config_backend = server_config_backend
def get(self):
if self._environment.is_onpremise:
user_config_backend = self._server_config_backend
else:
redis_metadata_connection_config = RedisMetadataConnectionConfigBuilder(self._server_config_backend).get()
redis_metadata_connection = RedisConnectionBuilder(redis_metadata_connection_config).get()
user_config_backend = RedisUserConfigStorageBuilder(redis_metadata_connection, self._username).get()
return user_config_backend
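# --- Illustrative sketch (not part of the diff): the composition used by the
# plpythonu entry points, made self-contained here with an in-memory server
# backend forced to the 'onpremise' environment (placeholder username/orgname).
from cartodb_services.refactor.core.environment import ServerEnvironmentBuilder
from cartodb_services.refactor.backend.org_config import OrgConfigBackendFactory
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
server_config_backend = InMemoryConfigStorage({'server_conf': {'environment': 'onpremise'}})
environment = ServerEnvironmentBuilder(server_config_backend).get()
user_config_backend = UserConfigBackendFactory('my_user', environment, server_config_backend).get()
org_config_backend = OrgConfigBackendFactory(None, environment, server_config_backend).get()
# In the 'onpremise' environment both factories fall back to the server backend itself.
assert user_config_backend is server_config_backend
assert org_config_backend is server_config_backend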

View File

@@ -0,0 +1,2 @@
class ConfigException(Exception):
pass

View File

@@ -0,0 +1,57 @@
class ServerEnvironment(object):
DEVELOPMENT = 'development'
STAGING = 'staging'
PRODUCTION = 'production'
ONPREMISE = 'onpremise'
VALID_ENVIRONMENTS = [
DEVELOPMENT,
STAGING,
PRODUCTION,
ONPREMISE
]
def __init__(self, environment_str):
assert environment_str in self.VALID_ENVIRONMENTS
self._environment_str = environment_str
def __str__(self):
return self._environment_str
@property
def is_development(self):
return self._environment_str == self.DEVELOPMENT
@property
def is_staging(self):
return self._environment_str == self.STAGING
@property
def is_production(self):
return self._environment_str == self.PRODUCTION
@property
def is_onpremise(self):
return self._environment_str == self.ONPREMISE
def __eq__(self, other):
return self._environment_str == other._environment_str
class ServerEnvironmentBuilder(object):
DEFAULT_ENVIRONMENT = ServerEnvironment.DEVELOPMENT
def __init__(self, server_config_storage):
self._server_config_storage = server_config_storage
def get(self):
server_config = self._server_config_storage.get('server_conf')
if not server_config or 'environment' not in server_config:
environment_str = self.DEFAULT_ENVIRONMENT
else:
environment_str = server_config['environment']
return ServerEnvironment(environment_str)

View File

@@ -0,0 +1,11 @@
import abc
class ConfigBackendInterface(object):
"""This is an interface that all config backends must abide to"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def get(self, key):
"""Return a value based on the key supplied from some storage"""
pass

View File

@@ -0,0 +1,112 @@
from dateutil.parser import parse as date_parse
class MapzenGeocoderConfig(object):
"""
Value object that represents the configuration needed to operate the mapzen service.
"""
def __init__(self,
geocoding_quota,
soft_geocoding_limit,
period_end_date,
cost_per_hit,
log_path,
mapzen_api_key,
username,
organization):
self._geocoding_quota = geocoding_quota
self._soft_geocoding_limit = soft_geocoding_limit
self._period_end_date = period_end_date
self._cost_per_hit = cost_per_hit
self._log_path = log_path
self._mapzen_api_key = mapzen_api_key
self._username = username
self._organization = organization
# Kind of generic properties. Note which ones are for actually running the
# service and which ones are needed for quota stuff.
@property
def service_type(self):
return 'geocoder_mapzen'
@property
def provider(self):
return 'mapzen'
@property
def is_high_resolution(self):
return True
@property
def geocoding_quota(self):
return self._geocoding_quota
@property
def soft_geocoding_limit(self):
return self._soft_geocoding_limit
@property
def period_end_date(self):
return self._period_end_date
@property
def cost_per_hit(self):
return self._cost_per_hit
# Server config, TODO: locate where this is actually used
@property
def log_path(self):
return self._log_path
# This is actually the specific one needed to run requests against the remote endpoint
@property
def mapzen_api_key(self):
return self._mapzen_api_key
# These two identify the user
@property
def username(self):
return self._username
@property
def organization(self):
return self._organization
# TODO: for BW compat, remove
@property
def google_geocoder(self):
return False
class MapzenGeocoderConfigBuilder(object):
def __init__(self, server_conf, user_conf, org_conf, username, orgname):
self._server_conf = server_conf
self._user_conf = user_conf
self._org_conf = org_conf
self._username = username
self._orgname = orgname
def get(self):
mapzen_server_conf = self._server_conf.get('mapzen_conf')
geocoding_quota = mapzen_server_conf['geocoder']['monthly_quota']
mapzen_api_key = mapzen_server_conf['geocoder']['api_key']
soft_geocoding_limit = self._user_conf.get('soft_geocoding_limit')
cost_per_hit=0
period_end_date_str = self._org_conf.get('period_end_date') or self._user_conf.get('period_end_date')
period_end_date = date_parse(period_end_date_str)
logger_conf = self._server_conf.get('logger_conf')
log_path = logger_conf['geocoder_log_path']
return MapzenGeocoderConfig(geocoding_quota,
soft_geocoding_limit,
period_end_date,
cost_per_hit,
log_path,
mapzen_api_key,
self._username,
self._orgname)
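# --- Illustrative sketch (not part of the diff): building the value object
# from in-memory config backends; all values are placeholders.
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
server_conf = InMemoryConfigStorage({
    'mapzen_conf': {'geocoder': {'monthly_quota': 100000, 'api_key': 'search-xxxx'}},
    'logger_conf': {'geocoder_log_path': '/tmp/geocodings.log'}})
user_conf = InMemoryConfigStorage({'soft_geocoding_limit': False,
                                   'period_end_date': '2016-11-01'})
org_conf = InMemoryConfigStorage({})
config = MapzenGeocoderConfigBuilder(server_conf, user_conf, org_conf,
                                     'my_user', None).get()
assert config.mapzen_api_key == 'search-xxxx'
assert config.geocoding_quota == 100000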

View File

@@ -0,0 +1,12 @@
from ..core.interfaces import ConfigBackendInterface
class InMemoryConfigStorage(ConfigBackendInterface):
def __init__(self, config_hash={}):
self._config_hash = config_hash
def get(self, key):
try:
return self._config_hash[key]
except KeyError:
return None

View File

@@ -0,0 +1,6 @@
from ..core.interfaces import ConfigBackendInterface
class NullConfigStorage(ConfigBackendInterface):
def get(self, key):
return None

View File

@@ -0,0 +1,36 @@
from ..core.interfaces import ConfigBackendInterface
from null_config import NullConfigStorage
class RedisConfigStorage(ConfigBackendInterface):
def __init__(self, connection, config_key):
self._connection = connection
self._config_key = config_key
self._data = None
def get(self, key):
if not self._data:
self._data = self._connection.hgetall(self._config_key)
return self._data[key]
class RedisUserConfigStorageBuilder(object):
def __init__(self, redis_connection, username):
self._redis_connection = redis_connection
self._username = username
def get(self):
return RedisConfigStorage(self._redis_connection, 'rails:users:{0}'.format(self._username))
class RedisOrgConfigStorageBuilder(object):
def __init__(self, redis_connection, orgname):
self._redis_connection = redis_connection
self._orgname = orgname
def get(self):
if self._orgname:
return RedisConfigStorage(self._redis_connection, 'rails:orgs:{0}'.format(self._orgname))
else:
return NullConfigStorage()
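# --- Illustrative sketch (not part of the diff): user config is read lazily
# from the 'rails:users:<username>' hash. MockRedis (as used in the test suite)
# stands in for a real connection; the field and value are placeholders.
from mockredis import MockRedis
conn = MockRedis()
conn.hset('rails:users:my_user', 'soft_geocoding_limit', 'true')
user_storage = RedisUserConfigStorageBuilder(conn, 'my_user').get()
assert user_storage.get('soft_geocoding_limit') == 'true'
# With no orgname the org builder degrades to a NullConfigStorage.
assert RedisOrgConfigStorageBuilder(conn, None).get().get('geocoding_quota') is None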

View File

@@ -0,0 +1,22 @@
from redis.sentinel import Sentinel
from redis import StrictRedis
class RedisConnectionBuilder():
def __init__(self, connection_config):
self._config = connection_config
def get(self):
if self._config.sentinel_id:
sentinel = Sentinel([(self._config.host,
self._config.port)],
socket_timeout=self._config.timeout)
return sentinel.master_for(self._config.sentinel_id,
socket_timeout=self._config.timeout,
db=self._config.db,
retry_on_timeout=True)
else:
conn = StrictRedis(host=self._config.host, port=self._config.port,
db=self._config.db, retry_on_timeout=True,
socket_timeout=self._config.timeout)
return conn

View File

@@ -0,0 +1,80 @@
from cartodb_services.refactor.config.exceptions import ConfigException
from abc import ABCMeta, abstractmethod
class RedisConnectionConfig(object):
"""
This represents a value object to contain configuration needed to set up
a connection to a redis server.
"""
def __init__(self, host, port, timeout, db, sentinel_id):
self._host = host
self._port = port
self._timeout = timeout
self._db = db
self._sentinel_id = sentinel_id
@property
def host(self):
return self._host
@property
def port(self):
return self._port
@property
def timeout(self):
return self._timeout
@property
def db(self):
return self._db
@property
def sentinel_id(self):
return self._sentinel_id
class RedisConnectionConfigBuilder(object):
__metaclass__ = ABCMeta
DEFAULT_USER_DB = 5
DEFAULT_TIMEOUT = 1.5 # seconds
@abstractmethod
def __init__(self, server_config_storage, config_key):
self._server_config_storage = server_config_storage
self._config_key = config_key
def get(self):
conf = self._server_config_storage.get(self._config_key)
if conf is None:
raise ConfigException("There is no redis configuration defined")
host = conf['redis_host']
port = conf['redis_port']
timeout = conf.get('timeout', self.DEFAULT_TIMEOUT) or self.DEFAULT_TIMEOUT
db = conf.get('redis_db', self.DEFAULT_USER_DB) or self.DEFAULT_USER_DB
sentinel_id = conf.get('sentinel_master_id', None)
return RedisConnectionConfig(host, port, timeout, db, sentinel_id)
class RedisMetadataConnectionConfigBuilder(RedisConnectionConfigBuilder):
def __init__(self, server_config_storage):
super(RedisMetadataConnectionConfigBuilder, self).__init__(
server_config_storage,
'redis_metadata_config'
)
class RedisMetricsConnectionConfigBuilder(RedisConnectionConfigBuilder):
def __init__(self, server_config_storage):
super(RedisMetricsConnectionConfigBuilder, self).__init__(
server_config_storage,
'redis_metrics_config'
)
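# --- Illustrative sketch (not part of the diff): building a connection config
# from an in-memory server backend and handing it to RedisConnectionBuilder.
# Host/port/db values are placeholders; with no sentinel_master_id the builder
# returns a plain StrictRedis client (no connection is made at construction).
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
from cartodb_services.refactor.storage.redis_connection import RedisConnectionBuilder
server_conf = InMemoryConfigStorage({
    'redis_metadata_config': {'redis_host': 'localhost', 'redis_port': 6379,
                              'redis_db': 5, 'timeout': 1.5}})
conn_config = RedisMetadataConnectionConfigBuilder(server_conf).get()
redis_conn = RedisConnectionBuilder(conn_config).get()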

View File

@@ -0,0 +1,14 @@
import json
import cartodb_services
from ..core.interfaces import ConfigBackendInterface
class InDbServerConfigStorage(ConfigBackendInterface):
def get(self, key):
sql = "SELECT cdb_dataservices_server.cdb_conf_getconf('{0}') as conf".format(key)
rows = cartodb_services.plpy.execute(sql, 1)
json_output = rows[0]['conf']
if json_output:
return json.loads(json_output)
else:
return None
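# --- Illustrative sketch (not part of the diff). Assumption: this runs inside a
# plpythonu function, where plpy and GD are provided by PL/Python, after the
# cartodb_services.init(plpy, GD) call described in cartodb_services/__init__.py.
import cartodb_services
cartodb_services.init(plpy, GD)
# Reads the 'logger_conf' entry through cdb_dataservices_server.cdb_conf_getconf
# and returns it as a parsed dict, or None if the key is not set.
logger_conf = InDbServerConfigStorage().get('logger_conf')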

View File

@@ -0,0 +1,52 @@
from cartodb_services.refactor.config.exceptions import ConfigException
class LoggerConfig(object):
"""This class is a value object needed to setup a Logger"""
def __init__(self, server_environment, rollbar_api_key, log_file_path, min_log_level):
self._server_environment = server_environment
self._rollbar_api_key = rollbar_api_key
self._log_file_path = log_file_path
self._min_log_level = min_log_level
@property
def environment(self):
return self._server_environment
@property
def rollbar_api_key(self):
return self._rollbar_api_key
@property
def log_file_path(self):
return self._log_file_path
@property
def min_log_level(self):
return self._min_log_level
# TODO this needs tests
class LoggerConfigBuilder(object):
def __init__(self, environment, server_config_storage):
self._server_environment = environment
self._server_config_storage = server_config_storage
def get(self):
logger_conf = self._server_config_storage.get('logger_conf')
if not logger_conf:
raise ConfigException('Logger configuration missing')
rollbar_api_key = self._get_value_or_none(logger_conf, 'rollbar_api_key')
log_file_path = self._get_value_or_none(logger_conf, 'log_file_path')
min_log_level = self._get_value_or_none(logger_conf, 'min_log_level') or 'warning'
logger_config = LoggerConfig(str(self._server_environment), rollbar_api_key, log_file_path, min_log_level)
return logger_config
def _get_value_or_none(self, logger_conf, key):
value = None
if key in logger_conf:
value = logger_conf[key]
return value
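# --- Illustrative sketch (not part of the diff): building a LoggerConfig from an
# in-memory server backend; the settings are placeholders. The plpythonu entry
# points then pass the resulting config to cartodb_services.tools.Logger.
from cartodb_services.refactor.core.environment import ServerEnvironment
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
server_conf = InMemoryConfigStorage({
    'logger_conf': {'min_log_level': 'warning', 'log_file_path': None}})
logger_config = LoggerConfigBuilder(ServerEnvironment('development'), server_conf).get()
assert logger_config.min_log_level == 'warning'
assert logger_config.rollbar_api_key is None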

View File

@@ -0,0 +1,8 @@
class RedisConnectionMock(object):
""" Simple class to mock a dummy behaviour for Redis related functions """
def zscore(self, redis_prefix, day):
pass
def zincrby(self, redis_prefix, day, amount):
pass

View File

@@ -1,4 +1,3 @@
import plpy
import rollbar
import logging
import json
@@ -6,7 +5,14 @@ import traceback
import sys
# Monkey patch because the plpython sys module doesn't have argv and the
# rollbar package uses it
sys.__dict__['argv'] = []
if 'argv' not in sys.__dict__:
sys.__dict__['argv'] = []
# Can only be imported when called from PL/Python
try:
import plpy
except ImportError:
pass
class Logger:
@@ -30,30 +36,28 @@ class Logger:
return
self._send_to_rollbar('debug', text, exception, data)
self._send_to_log_file('debug', text, exception, data)
plpy.debug(text)
self._send_to_plpy('debug', text)
def info(self, text, exception=None, data={}):
if not self._check_min_level('info'):
return
self._send_to_rollbar('info', text, exception, data)
self._send_to_log_file('info', text, exception, data)
plpy.info(text)
self._send_to_plpy('info', text)
def warning(self, text, exception=None, data={}):
if not self._check_min_level('warning'):
return
self._send_to_rollbar('warning', text, exception, data)
self._send_to_log_file('warning', text, exception, data)
plpy.warning(text)
self._send_to_plpy('warning', text)
def error(self, text, exception=None, data={}):
if not self._check_min_level('error'):
return
self._send_to_rollbar('error', text, exception, data)
self._send_to_log_file('error', text, exception, data)
# plpy.error and plpy.fatal raise exceptions and we only want to log an
# error; exceptions should be raised explicitly
plpy.warning(text)
self._send_to_plpy('error', text)
def _check_min_level(self, level):
return True if self.LEVELS[level] >= self._min_level else False
@@ -82,6 +86,19 @@ class Logger:
elif level == 'error':
self._file_logger.error(text, extra=extra_data)
def _send_to_plpy(self, level, text):
if self._check_plpy():
if level == 'debug':
plpy.debug(text)
elif level == 'info':
plpy.info(text)
elif level == 'warning':
plpy.warning(text)
elif level == 'error':
# plpy.error and plpy.fatal raise exceptions and we only want to
# log an error; exceptions should be raised explicitly
plpy.warning(text)
def _parse_log_extra_data(self, exception, data):
extra_data = {}
if exception:
@@ -118,6 +135,13 @@ class Logger:
def _log_file_activated(self):
return True if self._config.log_file_path else False
def _check_plpy(self):
try:
module = sys.modules['plpy']
return True
except KeyError:
return False
class ConfigException(Exception):
pass

View File

@@ -10,7 +10,7 @@ from setuptools import setup, find_packages
setup(
name='cartodb_services',
version='0.7.4.2',
version='0.9.4',
description='CartoDB Services API Python Library',

View File

@@ -0,0 +1,67 @@
from unittest import TestCase
from mockredis import MockRedis
from cartodb_services.metrics.config import RoutingConfig, ServicesRedisConfig
from ..test_helper import build_plpy_mock
class TestRoutingConfig(TestCase):
def setUp(self):
self._redis_conn = MockRedis()
self._db_conn = build_plpy_mock()
self._username = 'my_test_user'
self._user_key = "rails:users:{0}".format(self._username)
self._redis_conn.hset(self._user_key, 'period_end_date', '2016-10-10')
def test_should_pick_quota_from_server_by_default(self):
orgname = None
config = RoutingConfig(self._redis_conn, self._db_conn, self._username, orgname)
assert config.monthly_quota == 1500000
def test_should_pick_quota_from_redis_if_present(self):
self._redis_conn.hset(self._user_key, 'mapzen_routing_quota', 1000)
orgname = None
config = RoutingConfig(self._redis_conn, self._db_conn, self._username, orgname)
assert config.monthly_quota == 1000
def test_org_quota_overrides_user_quota(self):
self._redis_conn.hset(self._user_key, 'mapzen_routing_quota', 1000)
orgname = 'my_test_org'
orgname_key = "rails:orgs:{0}".format(orgname)
self._redis_conn.hset(orgname_key, 'period_end_date', '2016-05-31')
self._redis_conn.hset(orgname_key, 'mapzen_routing_quota', 5000)
# TODO: these are not too relevant for the routing config
self._redis_conn.hset(orgname_key, 'geocoding_quota', 0)
self._redis_conn.hset(orgname_key, 'here_isolines_quota', 0)
config = RoutingConfig(self._redis_conn, self._db_conn, self._username, orgname)
assert config.monthly_quota == 5000
def test_should_have_soft_limit_false_by_default(self):
orgname = None
config = RoutingConfig(self._redis_conn, self._db_conn, self._username, orgname)
assert config.soft_limit == False
def test_can_set_soft_limit_in_user_conf(self):
self._redis_conn.hset(self._user_key, 'soft_mapzen_routing_limit', True)
orgname = None
config = RoutingConfig(self._redis_conn, self._db_conn, self._username, orgname)
assert config.soft_limit == True
class TestServicesRedisConfig(TestCase):
def test_it_picks_mapzen_routing_quota_from_redis(self):
redis_conn = MockRedis()
redis_conn.hset('rails:users:my_username', 'mapzen_routing_quota', 42)
redis_config = ServicesRedisConfig(redis_conn).build('my_username', None)
assert 'mapzen_routing_quota' in redis_config
assert int(redis_config['mapzen_routing_quota']) == 42
def test_org_quota_overrides_user_quota(self):
redis_conn = MockRedis()
redis_conn.hset('rails:users:my_username', 'mapzen_routing_quota', 42)
redis_conn.hset('rails:orgs:acme', 'mapzen_routing_quota', 31415)
redis_config = ServicesRedisConfig(redis_conn).build('my_username', 'acme')
assert 'mapzen_routing_quota' in redis_config
assert int(redis_config['mapzen_routing_quota']) == 31415

View File

@@ -0,0 +1,85 @@
from unittest import TestCase
from mockredis import MockRedis
from ..test_helper import build_plpy_mock
from cartodb_services.metrics.quota import QuotaChecker
from cartodb_services.metrics import RoutingConfig
from datetime import datetime
class RoutingConfigMock(object):
def __init__(self, **kwargs):
self.__dict__ = kwargs
class TestQuotaChecker(TestCase):
def setUp(self):
self.username = 'my_test_user'
self.period_end_date = datetime.today()
self.service_type = 'routing_mapzen'
self.redis_key = 'user:{0}:{1}:success_responses:{2}{3}'.format(
self.username,
self.service_type,
self.period_end_date.year,
self.period_end_date.month
)
def test_routing_quota_check_passes_when_enough_quota(self):
user_service_config = RoutingConfigMock(
username = self.username,
organization = None,
service_type = self.service_type,
monthly_quota = 1000,
period_end_date = datetime.today(),
soft_limit = False
)
redis_conn = MockRedis()
redis_conn.zincrby(self.redis_key, self.period_end_date.day, 999)
assert QuotaChecker(user_service_config, redis_conn).check() == True
def test_routing_quota_check_fails_when_quota_exhausted(self):
user_service_config = RoutingConfigMock(
username = self.username,
organization = None,
service_type = self.service_type,
monthly_quota = 1000,
period_end_date = datetime.today(),
soft_limit = False
)
redis_conn = MockRedis()
redis_conn.zincrby(self.redis_key, self.period_end_date.day, 1001)
checker = QuotaChecker(user_service_config, redis_conn)
assert checker.check() == False
def test_routing_quota_check_fails_right_in_the_limit(self):
"""
I have 1000 credits and I just spent 1000 today. I should not pass
the check to perform the 1001st routing operation.
"""
user_service_config = RoutingConfigMock(
username = self.username,
organization = None,
service_type = self.service_type,
monthly_quota = 1000,
period_end_date = datetime.today(),
soft_limit = False
)
redis_conn = MockRedis()
redis_conn.zincrby(self.redis_key, self.period_end_date.day, 1000)
checker = QuotaChecker(user_service_config, redis_conn)
assert checker.check() == False
def test_routing_quota_check_passes_if_no_quota_but_soft_limit(self):
user_service_config = RoutingConfigMock(
username = self.username,
organization = None,
service_type = self.service_type,
monthly_quota = 1000,
period_end_date = datetime.today(),
soft_limit = True
)
redis_conn = MockRedis()
redis_conn.zincrby(self.redis_key, self.period_end_date.day, 1001)
checker = QuotaChecker(user_service_config, redis_conn)
assert checker.check() == True
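Taken together, the four QuotaChecker tests pin down the boundary rule: a request is rejected as soon as the used quota reaches the monthly quota, unless the soft limit is enabled. A minimal sketch of that rule, assuming nothing beyond what the assertions show (the real logic lives in cartodb_services.metrics.quota):

def passes_quota_check(monthly_quota, used_quota, soft_limit=False):
    # a soft limit always lets the request through; otherwise the usage must
    # still be strictly below the monthly quota (1000 used out of 1000 fails)
    if soft_limit:
        return True
    return used_quota < monthly_quota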

View File

@@ -0,0 +1,30 @@
from unittest import TestCase
from cartodb_services.metrics import UserMetricsService
import datetime
from mockredis import MockRedis
class UserGeocoderConfig(object):
def __init__(self, **kwargs):
self.__dict__ = kwargs
class TestUserMetricsService(TestCase):
def setUp(self):
user_geocoder_config = UserGeocoderConfig(
username = 'my_test_user',
organization = None,
period_end_date = datetime.date.today()
)
redis_conn = MockRedis()
self.user_metrics_service = UserMetricsService(user_geocoder_config, redis_conn)
def test_routing_used_quota_zero_when_no_usage(self):
assert self.user_metrics_service.used_quota(UserMetricsService.SERVICE_MAPZEN_ROUTING, datetime.date.today()) == 0
def test_routing_used_quota_counts_usages(self):
self.user_metrics_service.increment_service_use(UserMetricsService.SERVICE_MAPZEN_ROUTING, 'success_responses')
self.user_metrics_service.increment_service_use(UserMetricsService.SERVICE_MAPZEN_ROUTING, 'empty_responses')
assert self.user_metrics_service.used_quota('routing_mapzen', datetime.date.today()) == 2
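These two tests assume that both success_responses and empty_responses count against the routing quota. A hedged sketch of that accounting, reusing the per-month sorted-set key layout from the QuotaChecker tests above (the key format is an assumption, not the library's documented API):

def used_quota_for_month(redis_conn, username, service, any_day_in_month):
    total = 0
    for metric in ('success_responses', 'empty_responses'):
        key = 'user:{0}:{1}:{2}:{3}{4}'.format(
            username, service, metric,
            any_day_in_month.year, any_day_in_month.month)
        # each sorted-set member is a day of the month; its score is a counter
        for _member, score in redis_conn.zrange(key, 0, -1, withscores=True):
            total += int(score)
    return total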

View File

@@ -0,0 +1,47 @@
from unittest import TestCase
from cartodb_services.refactor.core.environment import *
from nose.tools import raises
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
class TestServerEnvironment(TestCase):
def test_can_be_a_valid_one(self):
env_dev = ServerEnvironment('development')
env_staging = ServerEnvironment('staging')
env_prod = ServerEnvironment('production')
env_onpremise = ServerEnvironment('onpremise')
@raises(AssertionError)
def test_cannot_be_a_non_valid_one(self):
env_whatever = ServerEnvironment('whatever')
def test_is_on_premise_returns_true_when_onpremise(self):
assert ServerEnvironment('onpremise').is_onpremise == True
def test_is_on_premise_returns_false_when_any_other(self):
assert ServerEnvironment('development').is_onpremise == False
assert ServerEnvironment('staging').is_onpremise == False
assert ServerEnvironment('production').is_onpremise == False
def test_equality(self):
assert ServerEnvironment('development') == ServerEnvironment('development')
assert ServerEnvironment('development') != ServerEnvironment('onpremise')
class TestServerEnvironmentBuilder(TestCase):
def test_returns_env_according_to_configuration(self):
server_config_storage = InMemoryConfigStorage({
'server_conf': {
'environment': 'staging'
}
})
server_env = ServerEnvironmentBuilder(server_config_storage).get()
assert server_env.is_staging == True
def test_returns_default_when_no_server_conf(self):
server_config_storage = InMemoryConfigStorage({})
server_env = ServerEnvironmentBuilder(server_config_storage).get()
assert server_env.is_development == True
assert str(server_env) == ServerEnvironmentBuilder.DEFAULT_ENVIRONMENT
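For reference, a value-object sketch consistent with the ServerEnvironment assertions above (valid names, is_onpremise, equality); it is an illustration, not the refactor module's actual class:

class ServerEnvironmentSketch(object):
    VALID_NAMES = ('development', 'staging', 'production', 'onpremise')

    def __init__(self, name):
        # an unknown name fails with the AssertionError the tests expect
        assert name in self.VALID_NAMES
        self._name = name

    @property
    def is_onpremise(self):
        return self._name == 'onpremise'

    def __eq__(self, other):
        return self._name == other._name

    def __ne__(self, other):
        # needed on Python 2, which does not derive != from ==
        return not self.__eq__(other)

    def __str__(self):
        return self._name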

View File

@@ -0,0 +1,12 @@
from unittest import TestCase
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
class TestInMemoryConfigStorage(TestCase):
def test_can_provide_values_from_hash(self):
server_config = InMemoryConfigStorage({'any_key': 'any_value'})
assert server_config.get('any_key') == 'any_value'
def test_gets_none_if_cannot_retrieve_key(self):
server_config = InMemoryConfigStorage()
assert server_config.get('any_non_existing_key') is None
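Both assertions follow from a storage that simply wraps a dict. A minimal sketch, assuming only the get() interface exercised here:

class InMemoryConfigStorageSketch(object):
    def __init__(self, config=None):
        self._config = config or {}

    def get(self, key):
        # dict.get already yields None for missing keys
        return self._config.get(key)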

View File

@@ -0,0 +1,14 @@
from unittest import TestCase
from cartodb_services.refactor.storage.null_config import NullConfigStorage
from cartodb_services.refactor.core.interfaces import ConfigBackendInterface
class TestNullConfigStorage(TestCase):
def test_is_a_config_backend(self):
null_config = NullConfigStorage()
assert isinstance(null_config, ConfigBackendInterface)
def test_returns_none_regardless_of_input(self):
null_config = NullConfigStorage()
assert null_config.get('whatever') is None

View File

@@ -0,0 +1,77 @@
from unittest import TestCase
from cartodb_services.refactor.storage.redis_config import *
from mockredis import MockRedis
from mock import Mock, MagicMock
from nose.tools import raises
class TestRedisConfigStorage(TestCase):
CONFIG_HASH_KEY = 'mykey'
def test_can_get_a_config_field(self):
connection = MockRedis()
connection.hset(self.CONFIG_HASH_KEY, 'field1', 42)
redis_config = RedisConfigStorage(connection, self.CONFIG_HASH_KEY)
value = redis_config.get('field1')
assert type(value) == str  # worth noting: redis always returns values as strings
assert value == '42'
@raises(KeyError)
def test_raises_an_exception_if_config_key_not_present(self):
connection = MockRedis()
redis_config = RedisConfigStorage(connection, self.CONFIG_HASH_KEY)
redis_config.get('whatever_field')
@raises(KeyError)
def test_raises_an_exception_if_field_not_present(self):
connection = MockRedis()
connection.hmset(self.CONFIG_HASH_KEY, {'field1': 42, 'field2': 43})
redis_config = RedisConfigStorage(connection, self.CONFIG_HASH_KEY)
redis_config.get('whatever_field')
def test_it_reads_the_config_hash_just_once(self):
connection = Mock()
connection.hgetall = MagicMock(return_value={'field1': '42'})
redis_config = RedisConfigStorage(connection, self.CONFIG_HASH_KEY)
assert redis_config.get('field1') == '42'
assert redis_config.get('field1') == '42'
connection.hgetall.assert_called_once_with(self.CONFIG_HASH_KEY)
class TestRedisUserConfigStorageBuilder(TestCase):
USERNAME = 'john'
EXPECTED_REDIS_CONFIG_HASH_KEY = 'rails:users:john'
def test_it_reads_the_correct_hash_key(self):
connection = Mock()
connection.hgetall = MagicMock(return_value={'an_user_config_field': 'nice'})
redis_config = RedisUserConfigStorageBuilder(connection, self.USERNAME).get()
assert redis_config.get('an_user_config_field') == 'nice'
connection.hgetall.assert_called_once_with(self.EXPECTED_REDIS_CONFIG_HASH_KEY)
class TestRedisOrgConfigStorageBuilder(TestCase):
ORGNAME = 'smith'
EXPECTED_REDIS_CONFIG_HASH_KEY = 'rails:orgs:smith'
def test_it_reads_the_correct_hash_key(self):
connection = Mock()
connection.hgetall = MagicMock(return_value={'an_org_config_field': 'awesome'})
redis_config = RedisOrgConfigStorageBuilder(connection, self.ORGNAME).get()
assert redis_config.get('an_org_config_field') == 'awesome'
connection.hgetall.assert_called_once_with(self.EXPECTED_REDIS_CONFIG_HASH_KEY)
def test_it_returns_a_null_config_storage_if_theres_no_orgname(self):
redis_config = RedisOrgConfigStorageBuilder(None, None).get()
assert type(redis_config) == NullConfigStorage
assert redis_config.get('whatever') is None
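The RedisConfigStorage tests fix two properties: values come back as the strings redis stores, and the whole hash is fetched with a single hgetall and then served from memory. A sketch under those assumptions (a missing hash or field surfaces as the KeyError the decorated tests expect):

class RedisConfigStorageSketch(object):
    def __init__(self, connection, hash_key):
        self._connection = connection
        self._hash_key = hash_key
        self._data = None

    def get(self, field):
        if self._data is None:
            # read the whole hash once and cache it for later lookups
            self._data = self._connection.hgetall(self._hash_key)
        # raises KeyError when the hash or the field does not exist
        return self._data[field]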

View File

@@ -0,0 +1,115 @@
from unittest import TestCase
from cartodb_services.refactor.storage.redis_connection_config import *
from cartodb_services.refactor.storage.mem_config import InMemoryConfigStorage
from cartodb_services.refactor.config.exceptions import ConfigException
class TestRedisConnectionConfig(TestCase):
def test_config_holds_values(self):
# this is mostly for completeness: a dummy test for a dummy value class
config = RedisConnectionConfig('myhost.com', 6379, 0.1, 5, None)
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == 0.1
assert config.db == 5
assert config.sentinel_id is None
class TestRedisConnectionConfigBuilder(TestCase):
def test_it_raises_exception_as_it_is_abstract(self):
server_config_storage = InMemoryConfigStorage()
self.assertRaises(TypeError, RedisConnectionConfigBuilder, server_config_storage, 'whatever_key')
class TestRedisMetadataConnectionConfigBuilder(TestCase):
def test_it_raises_exception_if_config_is_missing(self):
server_config_storage = InMemoryConfigStorage()
config_builder = RedisMetadataConnectionConfigBuilder(server_config_storage)
self.assertRaises(ConfigException, config_builder.get)
def test_it_gets_a_valid_config_from_the_server_storage(self):
server_config_storage = InMemoryConfigStorage({
'redis_metadata_config': {
'redis_host': 'myhost.com',
'redis_port': 6379,
'timeout': 0.2,
'redis_db': 3,
'sentinel_master_id': None
}
})
config = RedisMetadataConnectionConfigBuilder(server_config_storage).get()
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == 0.2
assert config.db == 3
assert config.sentinel_id is None
def test_it_gets_a_default_timeout_if_none(self):
server_config_storage = InMemoryConfigStorage({
'redis_metadata_config': {
'redis_host': 'myhost.com',
'redis_port': 6379,
'timeout': None,
'redis_db': 3,
'sentinel_master_id': None
}
})
config = RedisMetadataConnectionConfigBuilder(server_config_storage).get()
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == RedisConnectionConfigBuilder.DEFAULT_TIMEOUT
assert config.db == 3
assert config.sentinel_id is None
def test_it_gets_a_default_db_if_none(self):
server_config_storage = InMemoryConfigStorage({
'redis_metadata_config': {
'redis_host': 'myhost.com',
'redis_port': 6379,
'timeout': 0.2,
'redis_db': None,
'sentinel_master_id': None
}
})
config = RedisMetadataConnectionConfigBuilder(server_config_storage).get()
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == 0.2
assert config.db == RedisConnectionConfigBuilder.DEFAULT_USER_DB
assert config.sentinel_id is None
class TestRedisMetricsConnectionConfigBuilder(TestCase):
def test_it_gets_a_valid_config_from_the_server_storage(self):
server_config_storage = InMemoryConfigStorage({
'redis_metrics_config': {
'redis_host': 'myhost.com',
'redis_port': 6379,
'timeout': 0.2,
'redis_db': 3,
'sentinel_master_id': 'some_master_id'
}
})
config = RedisMetricsConnectionConfigBuilder(server_config_storage).get()
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == 0.2
assert config.db == 3
assert config.sentinel_id == 'some_master_id'
def test_it_sets_absent_values_to_none_or_defaults(self):
server_config_storage = InMemoryConfigStorage({
'redis_metrics_config': {
'redis_host': 'myhost.com',
'redis_port': 6379,
}
})
config = RedisMetricsConnectionConfigBuilder(server_config_storage).get()
assert config.host == 'myhost.com'
assert config.port == 6379
assert config.timeout == 1.5
assert config.db == 5
assert config.sentinel_id is None
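The default-fallback tests above only hold if missing timeout/db entries are replaced by builder defaults (1.5 seconds and db 5 for the metrics connection, per the assertions). An illustrative fallback, with the constant names guessed rather than taken from the library:

ASSUMED_DEFAULT_TIMEOUT = 1.5  # matches the metrics-builder assertion above
ASSUMED_DEFAULT_DB = 5         # likewise inferred from the test, not the code

def connection_params(conf):
    return {
        'host': conf['redis_host'],
        'port': conf['redis_port'],
        'timeout': conf.get('timeout') or ASSUMED_DEFAULT_TIMEOUT,
        'db': conf.get('redis_db') or ASSUMED_DEFAULT_DB,
        'sentinel_id': conf.get('sentinel_master_id'),
    }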

View File

@@ -0,0 +1,31 @@
from unittest import TestCase
from mock import Mock, MagicMock
from nose.tools import raises
from cartodb_services.refactor.storage.server_config import *
import cartodb_services
class TestInDbServerConfigStorage(TestCase):
def setUp(self):
self.plpy_mock = Mock()
cartodb_services.init(self.plpy_mock, _GD={})
def tearDown(self):
cartodb_services._reset()
def test_gets_configs_from_db(self):
self.plpy_mock.execute = MagicMock(return_value=[{'conf': '"any_value"'}])
server_config = InDbServerConfigStorage()
assert server_config.get('any_config') == 'any_value'
self.plpy_mock.execute.assert_called_once_with("SELECT cdb_dataservices_server.cdb_conf_getconf('any_config') as conf", 1)
def test_gets_none_if_cannot_retrieve_key(self):
self.plpy_mock.execute = MagicMock(return_value=[{'conf': None}])
server_config = InDbServerConfigStorage()
assert server_config.get('any_non_existing_key') is None
def test_deserializes_from_db_to_plain_dict(self):
self.plpy_mock.execute = MagicMock(return_value=[{'conf': '{"environment": "testing"}'}])
server_config = InDbServerConfigStorage()
assert server_config.get('server_conf') == {'environment': 'testing'}
self.plpy_mock.execute.assert_called_once_with("SELECT cdb_dataservices_server.cdb_conf_getconf('server_conf') as conf", 1)
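A sketch of the lookup these InDbServerConfigStorage tests describe: a single-row query through plpy against cdb_dataservices_server.cdb_conf_getconf, with the JSON result decoded into plain Python values (the helper below is illustrative; the real class resolves plpy via cartodb_services.init):

import json

def get_server_conf(plpy, key):
    # single-row lookup, mirroring the query string asserted in the tests
    rows = plpy.execute(
        "SELECT cdb_dataservices_server.cdb_conf_getconf('{0}') as conf".format(key), 1)
    conf = rows[0]['conf']
    # cdb_conf_getconf yields a json value or NULL; decode or propagate None
    return json.loads(conf) if conf is not None else None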

View File

@@ -6,7 +6,7 @@ import requests_mock
from mock import Mock
from cartodb_services.mapzen import MapzenGeocoder
from cartodb_services.mapzen.exceptions import MalformedResult
from cartodb_services.mapzen.exceptions import MalformedResult, TimeoutException
requests_mock.Mocker.TEST_PREFIX = 'test_'

View File

@@ -0,0 +1,33 @@
import test_helper
import requests
from unittest import TestCase
from nose.tools import assert_raises
from datetime import datetime, date
from cartodb_services.mapzen.qps import qps_retry
from cartodb_services.mapzen.exceptions import ServiceException, TimeoutException
import requests_mock
import mock
requests_mock.Mocker.TEST_PREFIX = 'test_'
@requests_mock.Mocker()
class TestQPS(TestCase):
QPS_ERROR_MESSAGE = "Queries per second exceeded: Queries exceeded (10 allowed)"
def test_qps_timeout(self, req_mock):
class TestClass:
@qps_retry(timeout=0.001, qps=100)
def test(self):
response = requests.get('http://localhost/test_qps')
if response.status_code == 429:
raise ServiceException('Error 429', response)
def _text_cb(request, context):
context.status_code = 429
return self.QPS_ERROR_MESSAGE
req_mock.register_uri('GET', 'http://localhost/test_qps',
text=_text_cb)
with self.assertRaises(TimeoutException):
c = TestClass()
c.test()