102 changes: 102 additions & 0 deletions clickhouse-cloud-alicloud/README.md
ApsaraDB for ClickHouse Enterprise Edition (ClickHouse Cloud on Alibaba Cloud) is a cloud service built on open-source ClickHouse. Its architecture and feature set differ from those of open-source ClickHouse.

To evaluate the performance of ClickHouse Cloud on Alibaba Cloud, follow these steps to set up and run the benchmark.

1. **Instance Purchase**:
Purchase an ApsaraDB for ClickHouse Enterprise Edition cluster instance and an ECS instance from Alibaba Cloud. Both instances must be in the same region for optimal network performance; for example, you can choose instances in the Hangzhou region.

- **ClickHouse Instance**: Refer to the [ApsaraDB for ClickHouse Purchase Guide](https://www.alibabacloud.com/help/en/clickhouse/create-a-cluster?spm=a2c63.p38356.help-menu-144466.d_1_3.48341f55GpqImZ) for details on creating a ClickHouse cluster.
- **Edition**: Select **Enterprise Edition** when creating the instance
   - **Storage Type**: Choose **ESSD_L1 (ON AFS, ADB File System)** for optimal performance
- **Compute Group**: For single-node testing, create a **single-node compute group**
- **ECS Instance**: Refer to the [ECS Instance Purchase Guide](https://www.alibabacloud.com/help/en/ecs/user-guide/create-an-instance-by-using-the-wizard) for details on creating an ECS instance.

2. **Access Key Setup**:
Create your Alibaba Cloud Access Key (AK, SK) by following the instructions in the [RAM User Guide](https://www.alibabacloud.com/help/en/ram/user-guide/create-an-accesskey-pair). You will need these credentials for data preparation and benchmark execution.

3. **Data Preparation**:
On your ECS instance, execute the `download.sh` script to download the ClickBench test dataset and upload it to OSS:

```bash
export AK=your_access_key
export SK=your_secret_key
export OSS_ENDPOINT=oss-cn-hangzhou-internal.aliyuncs.com
export OSS_PATH=oss://your-bucket-name/clickbench/hits_parquets/
./download.sh
```

This script will:
- Download the latest ClickBench dataset from the official source (100 parquet files)
- Upload the parquet files to your OSS bucket for use in the benchmark

**Note**: Make sure to replace `your-bucket-name` with your actual OSS bucket name and adjust the endpoint region if needed.
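
   Before uploading, it can help to confirm that all 100 shards actually landed in `hits_parquets`. A minimal sketch of such a check (the files are simulated here with `touch` so the check can be demonstrated offline; in practice, point it at the real download directory):

   ```shell
   # Count the downloaded ClickBench shards before uploading to OSS.
   # A temporary directory with empty placeholder files stands in for
   # the real hits_parquets directory from download.sh.
   dir=$(mktemp -d)
   for i in $(seq 0 99); do touch "$dir/hits_$i.parquet"; done
   count=$(ls "$dir"/hits_*.parquet | wc -l)
   echo "shards: $((count))"
   rm -r "$dir"
   ```

   A count other than 100 usually means `wget` was interrupted and the missing shards need to be re-fetched.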

4. **ClickHouse Instance Configuration**:
Before running the benchmark, configure your ClickHouse instance:

- **Create Test User**: Create a database user account for running the benchmark tests
- **Whitelist ECS IP**: Add your ECS instance's internal IP address to the ClickHouse instance whitelist to allow connections

Refer to the [ClickHouse Security Configuration Guide](https://www.alibabacloud.com/help/en/clickhouse/user-guide/configure-ip-address-whitelists) for detailed instructions.

5. **Environment Variables Setup**:
Set up the following environment variables on your ECS instance:

```bash
export FQDN=your_clickhouse_host # ClickHouse instance FQDN
export USER=user_name # ClickHouse username
export PASSWORD=your_password # ClickHouse password
export AK=access_key # Alibaba Cloud Access Key ID
export SK=secret_key # Alibaba Cloud Secret Access Key
export STORAGE=afs # Storage type identifier (e.g., afs, oss)
export REPLICAS=replicas_num # Number of replicas
export CCU=your_ccu # Compute Capacity Units (CCUs)
export ECS=ECS_instance_generation     # ECS instance generation (e.g., r8i)
export OSS_URL="https://your-bucket-name.oss-cn-hangzhou-internal.aliyuncs.com/clickbench/hits_parquets/hits_{0..99}.parquet"
```

**Important**:
- Replace `your-bucket-name` with your actual OSS bucket name
- Ensure the OSS_URL matches the path where you uploaded the data in step 3
- Use the internal endpoint for better performance and no data transfer fees
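
   The `{0..99}` range in `OSS_URL` is expanded by ClickHouse's s3 path globbing, not by the shell (the double quotes suppress bash brace expansion). To preview which files the pattern covers, you can let bash expand an unquoted copy (`example-bucket` is a placeholder):

   ```shell
   # Unquoted, bash performs the same {0..99} expansion that ClickHouse's
   # s3/s3Cluster functions apply to the quoted OSS_URL pattern.
   urls=(https://example-bucket.oss-cn-hangzhou-internal.aliyuncs.com/clickbench/hits_parquets/hits_{0..99}.parquet)
   echo "${#urls[@]} files"
   echo "first: ${urls[0]}"
   echo "last:  ${urls[99]}"
   ```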

6. **Benchmark Execution**:
Execute the `benchmark.sh` script to run the complete benchmark test:

```bash
./benchmark.sh
```
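
   `benchmark.sh` reads every variable from step 5 without checking them; a small pre-flight check (a hypothetical helper, not part of the provided scripts) can fail fast on a missing one:

   ```shell
   # Print the names of any required variables that are unset or empty,
   # or "ok" if the environment is complete.
   check_env() {
     local missing="" v
     for v in FQDN USER PASSWORD AK SK STORAGE REPLICAS CCU ECS OSS_URL; do
       # ${!v} is bash indirect expansion: the value of the variable named by v.
       [ -n "${!v:-}" ] || missing="$missing $v"
     done
     echo "${missing:-ok}"
   }

   check_env
   ```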

7. **Complete Example**:
Here's a complete workflow example:

```bash
# Step 1: Download and upload test data to OSS
export AK=LTAI5txxxxxxxxxx
export SK=xxxxxxxxxxxxxxxx
export OSS_ENDPOINT=oss-cn-hangzhou-internal.aliyuncs.com
export OSS_PATH=oss://clickhouse-test-bucket/clickbench/hits_parquets/
./download.sh

# Step 2: Configure ClickHouse instance (via web console)
# - Create test user (or use default user)
# - Whitelist ECS internal IP address

# Step 3: Set environment variables and run benchmark
export FQDN=xxxxx.clickhouse.aliyuncs.com
export USER=default
export PASSWORD=YourPassword123
export AK=LTAI5txxxxxxxxxx
export SK=xxxxxxxxxxxxxxxx
export STORAGE=afs
export REPLICAS=2
export CCU=32
export ECS=r8i
export OSS_URL="https://clickhouse-test-bucket.oss-cn-hangzhou-internal.aliyuncs.com/clickbench/hits_parquets/hits_{0..99}.parquet"

./benchmark.sh
```

8. **Results**:
The benchmark results will be saved in the `results/` directory in JSON format, with filenames following the pattern: `alicloud-{CATEGORY}-{REPLICAS}-{STORAGE}-{CCU}-{REPLICAS}.json`
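
   As a quick illustration, plugging in the example values from step 7 reproduces the path the script writes (note that `REPLICAS` appears twice, mirroring the current `benchmark.sh`):

   ```shell
   # Reconstruct the output path exactly as benchmark.sh assembles it.
   CATEGORY=enterprise
   REPLICAS=2
   STORAGE=afs
   CCU=32
   file="results/alicloud-$CATEGORY-$REPLICAS-$STORAGE-$CCU-$REPLICAS.json"
   echo "$file"   # results/alicloud-enterprise-2-afs-32-2.json
   ```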

48 changes: 48 additions & 0 deletions clickhouse-cloud-alicloud/benchmark.sh
#!/bin/bash -e

# Load the data

# export FQDN=...
# export USER=...
# export PASSWORD=...
# export AK=...
# export SK=...
# export STORAGE=...
# export REPLICAS=...
# export CCU=...
# export ECS=...
# export OSS_URL=..., e.g., "https://clickhouse-test-clickbench-hangzhou.oss-cn-hangzhou-internal.aliyuncs.com/clickbench/hits_parquets/hits_{0..99}.parquet"

export CATEGORY="enterprise" # enterprise or community

MAX_INSERT_THREADS=$(clickhouse-client --host "$FQDN" --user "$USER" --password "$PASSWORD" --query "SELECT intDiv(getSetting('max_threads'), 4)")

load_time=$(clickhouse-client --host "$FQDN" --user "$USER" --password "$PASSWORD" --enable_parallel_replicas 1 --max_insert_threads "$MAX_INSERT_THREADS" --query="INSERT INTO hits SELECT * FROM s3Cluster('default', '$OSS_URL', '$AK', '$SK', 'parquet');" --time 2>&1)


result=$(./run.sh)

data_size=$(clickhouse-client --host "$FQDN" --user "$USER" --password "$PASSWORD" --query="SELECT total_bytes FROM system.tables WHERE name = 'hits' AND database = 'default'")

echo '
{
"system": "ApsaraDB for ClickHouse('$CATEGORY', '$STORAGE')",
"date": "'$(date +%F)'",
"machine": "AliCloud: '$CCU'CCU, '$ECS'",
"cluster_size": '$REPLICAS',

"proprietary": "yes",
"hardware": "cpu",
"tuned": "no",
"comment": "",

"tags": ["C++", "column-oriented", "ClickHouse derivative", "managed", "alicloud"],

"load_time": '$load_time',
"data_size": '$data_size',

"result": [
'$(echo "$result" | sed '$ s/.$//')'
]
}
' > "results/alicloud-$CATEGORY-$REPLICAS-$STORAGE-$CCU-$REPLICAS.json"
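
The `sed '$ s/.$//'` above assumes `run.sh` prints one comma-terminated JSON array of query timings per line; deleting the last character of the last line removes the trailing comma so the arrays splice into the `"result"` field as valid JSON. A small demonstration with made-up timings:

```shell
# Simulated run.sh output: one timing array per query, each line
# ending in a comma.
result='[0.01,0.005,0.004],
[0.12,0.09,0.08],'
# Delete the final character of the last line (the trailing comma).
echo "$result" | sed '$ s/.$//'
```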
109 changes: 109 additions & 0 deletions clickhouse-cloud-alicloud/create.sql
CREATE TABLE hits
-- Review comment: It would be cool if create.sql and queries.sql were symlinks
-- to the corresponding files in the clickhouse/ folder.
-- Reply: Does not matter.
(
WatchID BIGINT NOT NULL,
JavaEnable SMALLINT NOT NULL,
Title TEXT NOT NULL,
GoodEvent SMALLINT NOT NULL,
EventTime TIMESTAMP NOT NULL,
EventDate Date NOT NULL,
CounterID INTEGER NOT NULL,
ClientIP INTEGER NOT NULL,
RegionID INTEGER NOT NULL,
UserID BIGINT NOT NULL,
CounterClass SMALLINT NOT NULL,
OS SMALLINT NOT NULL,
UserAgent SMALLINT NOT NULL,
URL TEXT NOT NULL,
Referer TEXT NOT NULL,
IsRefresh SMALLINT NOT NULL,
RefererCategoryID SMALLINT NOT NULL,
RefererRegionID INTEGER NOT NULL,
URLCategoryID SMALLINT NOT NULL,
URLRegionID INTEGER NOT NULL,
ResolutionWidth SMALLINT NOT NULL,
ResolutionHeight SMALLINT NOT NULL,
ResolutionDepth SMALLINT NOT NULL,
FlashMajor SMALLINT NOT NULL,
FlashMinor SMALLINT NOT NULL,
FlashMinor2 TEXT NOT NULL,
NetMajor SMALLINT NOT NULL,
NetMinor SMALLINT NOT NULL,
UserAgentMajor SMALLINT NOT NULL,
UserAgentMinor VARCHAR(255) NOT NULL,
CookieEnable SMALLINT NOT NULL,
JavascriptEnable SMALLINT NOT NULL,
IsMobile SMALLINT NOT NULL,
MobilePhone SMALLINT NOT NULL,
MobilePhoneModel TEXT NOT NULL,
Params TEXT NOT NULL,
IPNetworkID INTEGER NOT NULL,
TraficSourceID SMALLINT NOT NULL,
SearchEngineID SMALLINT NOT NULL,
SearchPhrase TEXT NOT NULL,
AdvEngineID SMALLINT NOT NULL,
IsArtifical SMALLINT NOT NULL,
WindowClientWidth SMALLINT NOT NULL,
WindowClientHeight SMALLINT NOT NULL,
ClientTimeZone SMALLINT NOT NULL,
ClientEventTime TIMESTAMP NOT NULL,
SilverlightVersion1 SMALLINT NOT NULL,
SilverlightVersion2 SMALLINT NOT NULL,
SilverlightVersion3 INTEGER NOT NULL,
SilverlightVersion4 SMALLINT NOT NULL,
PageCharset TEXT NOT NULL,
CodeVersion INTEGER NOT NULL,
IsLink SMALLINT NOT NULL,
IsDownload SMALLINT NOT NULL,
IsNotBounce SMALLINT NOT NULL,
FUniqID BIGINT NOT NULL,
OriginalURL TEXT NOT NULL,
HID INTEGER NOT NULL,
IsOldCounter SMALLINT NOT NULL,
IsEvent SMALLINT NOT NULL,
IsParameter SMALLINT NOT NULL,
DontCountHits SMALLINT NOT NULL,
WithHash SMALLINT NOT NULL,
HitColor CHAR NOT NULL,
LocalEventTime TIMESTAMP NOT NULL,
Age SMALLINT NOT NULL,
Sex SMALLINT NOT NULL,
Income SMALLINT NOT NULL,
Interests SMALLINT NOT NULL,
Robotness SMALLINT NOT NULL,
RemoteIP INTEGER NOT NULL,
WindowName INTEGER NOT NULL,
OpenerName INTEGER NOT NULL,
HistoryLength SMALLINT NOT NULL,
BrowserLanguage TEXT NOT NULL,
BrowserCountry TEXT NOT NULL,
SocialNetwork TEXT NOT NULL,
SocialAction TEXT NOT NULL,
HTTPError SMALLINT NOT NULL,
SendTiming INTEGER NOT NULL,
DNSTiming INTEGER NOT NULL,
ConnectTiming INTEGER NOT NULL,
ResponseStartTiming INTEGER NOT NULL,
ResponseEndTiming INTEGER NOT NULL,
FetchTiming INTEGER NOT NULL,
SocialSourceNetworkID SMALLINT NOT NULL,
SocialSourcePage TEXT NOT NULL,
ParamPrice BIGINT NOT NULL,
ParamOrderID TEXT NOT NULL,
ParamCurrency TEXT NOT NULL,
ParamCurrencyID SMALLINT NOT NULL,
OpenstatServiceName TEXT NOT NULL,
OpenstatCampaignID TEXT NOT NULL,
OpenstatAdID TEXT NOT NULL,
OpenstatSourceID TEXT NOT NULL,
UTMSource TEXT NOT NULL,
UTMMedium TEXT NOT NULL,
UTMCampaign TEXT NOT NULL,
UTMContent TEXT NOT NULL,
UTMTerm TEXT NOT NULL,
FromTag TEXT NOT NULL,
HasGCLID SMALLINT NOT NULL,
RefererHash BIGINT NOT NULL,
URLHash BIGINT NOT NULL,
CLID INTEGER NOT NULL,
PRIMARY KEY (CounterID, EventDate, UserID, EventTime, WatchID)
);
21 changes: 21 additions & 0 deletions clickhouse-cloud-alicloud/download.sh
#!/bin/bash

# export OSS_ENDPOINT=..., e.g., oss-cn-hangzhou-internal.aliyuncs.com
# export OSS_PATH=..., e.g., oss://clickhouse-test-clickbench-hangzhou/clickbench/hits_parquets/
# export AK=...
# export SK=...

mkdir -p hits_parquets

# Remove existing data packages in the current path and get the latest data from the official website
# rm hits_parquets/*

# Download the latest data package
cd ./hits_parquets || exit 1
wget https://datasets.clickhouse.com/hits_compatible/athena_partitioned/hits_{0..99}.parquet
cd ..

# Remove historical ClickBench data
# ossutil rm -i "$AK" -k "$SK" --endpoint "$OSS_ENDPOINT" "$OSS_PATH"

# Upload the latest test dataset package
ossutil cp -r -i "$AK" -k "$SK" --endpoint "$OSS_ENDPOINT" ./hits_parquets "$OSS_PATH"
43 changes: 43 additions & 0 deletions clickhouse-cloud-alicloud/queries.sql
SELECT COUNT(*) FROM hits;
SELECT COUNT(*) FROM hits WHERE AdvEngineID <> 0;
SELECT SUM(AdvEngineID), COUNT(*), AVG(ResolutionWidth) FROM hits;
SELECT AVG(UserID) FROM hits;
SELECT COUNT(DISTINCT UserID) FROM hits;
SELECT COUNT(DISTINCT SearchPhrase) FROM hits;
SELECT MIN(EventDate), MAX(EventDate) FROM hits;
SELECT AdvEngineID, COUNT(*) FROM hits WHERE AdvEngineID <> 0 GROUP BY AdvEngineID ORDER BY COUNT(*) DESC;
SELECT RegionID, COUNT(DISTINCT UserID) AS u FROM hits GROUP BY RegionID ORDER BY u DESC LIMIT 10;
SELECT RegionID, SUM(AdvEngineID), COUNT(*) AS c, AVG(ResolutionWidth), COUNT(DISTINCT UserID) FROM hits GROUP BY RegionID ORDER BY c DESC LIMIT 10;
SELECT MobilePhoneModel, COUNT(DISTINCT UserID) AS u FROM hits WHERE MobilePhoneModel <> '' GROUP BY MobilePhoneModel ORDER BY u DESC LIMIT 10;
SELECT MobilePhone, MobilePhoneModel, COUNT(DISTINCT UserID) AS u FROM hits WHERE MobilePhoneModel <> '' GROUP BY MobilePhone, MobilePhoneModel ORDER BY u DESC LIMIT 10;
SELECT SearchPhrase, COUNT(*) AS c FROM hits WHERE SearchPhrase <> '' GROUP BY SearchPhrase ORDER BY c DESC LIMIT 10;
SELECT SearchPhrase, COUNT(DISTINCT UserID) AS u FROM hits WHERE SearchPhrase <> '' GROUP BY SearchPhrase ORDER BY u DESC LIMIT 10;
SELECT SearchEngineID, SearchPhrase, COUNT(*) AS c FROM hits WHERE SearchPhrase <> '' GROUP BY SearchEngineID, SearchPhrase ORDER BY c DESC LIMIT 10;
SELECT UserID, COUNT(*) FROM hits GROUP BY UserID ORDER BY COUNT(*) DESC LIMIT 10;
SELECT UserID, SearchPhrase, COUNT(*) FROM hits GROUP BY UserID, SearchPhrase ORDER BY COUNT(*) DESC LIMIT 10;
SELECT UserID, SearchPhrase, COUNT(*) FROM hits GROUP BY UserID, SearchPhrase LIMIT 10;
SELECT UserID, extract(minute FROM EventTime) AS m, SearchPhrase, COUNT(*) FROM hits GROUP BY UserID, m, SearchPhrase ORDER BY COUNT(*) DESC LIMIT 10;
SELECT UserID FROM hits WHERE UserID = 435090932899640449;
SELECT COUNT(*) FROM hits WHERE URL LIKE '%google%';
SELECT SearchPhrase, MIN(URL), COUNT(*) AS c FROM hits WHERE URL LIKE '%google%' AND SearchPhrase <> '' GROUP BY SearchPhrase ORDER BY c DESC LIMIT 10;
SELECT SearchPhrase, MIN(URL), MIN(Title), COUNT(*) AS c, COUNT(DISTINCT UserID) FROM hits WHERE Title LIKE '%Google%' AND URL NOT LIKE '%.google.%' AND SearchPhrase <> '' GROUP BY SearchPhrase ORDER BY c DESC LIMIT 10;
SELECT * FROM hits WHERE URL LIKE '%google%' ORDER BY EventTime LIMIT 10;
SELECT SearchPhrase FROM hits WHERE SearchPhrase <> '' ORDER BY EventTime LIMIT 10;
SELECT SearchPhrase FROM hits WHERE SearchPhrase <> '' ORDER BY SearchPhrase LIMIT 10;
SELECT SearchPhrase FROM hits WHERE SearchPhrase <> '' ORDER BY EventTime, SearchPhrase LIMIT 10;
SELECT CounterID, AVG(length(URL)) AS l, COUNT(*) AS c FROM hits WHERE URL <> '' GROUP BY CounterID HAVING COUNT(*) > 100000 ORDER BY l DESC LIMIT 25;
SELECT REGEXP_REPLACE(Referer, '^https?://(?:www\.)?([^/]+)/.*$', '\1') AS k, AVG(length(Referer)) AS l, COUNT(*) AS c, MIN(Referer) FROM hits WHERE Referer <> '' GROUP BY k HAVING COUNT(*) > 100000 ORDER BY l DESC LIMIT 25;
SELECT SUM(ResolutionWidth), SUM(ResolutionWidth + 1), SUM(ResolutionWidth + 2), SUM(ResolutionWidth + 3), SUM(ResolutionWidth + 4), SUM(ResolutionWidth + 5), SUM(ResolutionWidth + 6), SUM(ResolutionWidth + 7), SUM(ResolutionWidth + 8), SUM(ResolutionWidth + 9), SUM(ResolutionWidth + 10), SUM(ResolutionWidth + 11), SUM(ResolutionWidth + 12), SUM(ResolutionWidth + 13), SUM(ResolutionWidth + 14), SUM(ResolutionWidth + 15), SUM(ResolutionWidth + 16), SUM(ResolutionWidth + 17), SUM(ResolutionWidth + 18), SUM(ResolutionWidth + 19), SUM(ResolutionWidth + 20), SUM(ResolutionWidth + 21), SUM(ResolutionWidth + 22), SUM(ResolutionWidth + 23), SUM(ResolutionWidth + 24), SUM(ResolutionWidth + 25), SUM(ResolutionWidth + 26), SUM(ResolutionWidth + 27), SUM(ResolutionWidth + 28), SUM(ResolutionWidth + 29), SUM(ResolutionWidth + 30), SUM(ResolutionWidth + 31), SUM(ResolutionWidth + 32), SUM(ResolutionWidth + 33), SUM(ResolutionWidth + 34), SUM(ResolutionWidth + 35), SUM(ResolutionWidth + 36), SUM(ResolutionWidth + 37), SUM(ResolutionWidth + 38), SUM(ResolutionWidth + 39), SUM(ResolutionWidth + 40), SUM(ResolutionWidth + 41), SUM(ResolutionWidth + 42), SUM(ResolutionWidth + 43), SUM(ResolutionWidth + 44), SUM(ResolutionWidth + 45), SUM(ResolutionWidth + 46), SUM(ResolutionWidth + 47), SUM(ResolutionWidth + 48), SUM(ResolutionWidth + 49), SUM(ResolutionWidth + 50), SUM(ResolutionWidth + 51), SUM(ResolutionWidth + 52), SUM(ResolutionWidth + 53), SUM(ResolutionWidth + 54), SUM(ResolutionWidth + 55), SUM(ResolutionWidth + 56), SUM(ResolutionWidth + 57), SUM(ResolutionWidth + 58), SUM(ResolutionWidth + 59), SUM(ResolutionWidth + 60), SUM(ResolutionWidth + 61), SUM(ResolutionWidth + 62), SUM(ResolutionWidth + 63), SUM(ResolutionWidth + 64), SUM(ResolutionWidth + 65), SUM(ResolutionWidth + 66), SUM(ResolutionWidth + 67), SUM(ResolutionWidth + 68), SUM(ResolutionWidth + 69), SUM(ResolutionWidth + 70), SUM(ResolutionWidth + 71), SUM(ResolutionWidth + 72), SUM(ResolutionWidth + 73), SUM(ResolutionWidth + 74), SUM(ResolutionWidth + 75), SUM(ResolutionWidth + 76), SUM(ResolutionWidth + 77), SUM(ResolutionWidth + 78), SUM(ResolutionWidth + 79), SUM(ResolutionWidth + 80), SUM(ResolutionWidth + 81), SUM(ResolutionWidth + 82), SUM(ResolutionWidth + 83), SUM(ResolutionWidth + 84), SUM(ResolutionWidth + 85), SUM(ResolutionWidth + 86), SUM(ResolutionWidth + 87), SUM(ResolutionWidth + 88), SUM(ResolutionWidth + 89) FROM hits;
SELECT SearchEngineID, ClientIP, COUNT(*) AS c, SUM(IsRefresh), AVG(ResolutionWidth) FROM hits WHERE SearchPhrase <> '' GROUP BY SearchEngineID, ClientIP ORDER BY c DESC LIMIT 10;
SELECT WatchID, ClientIP, COUNT(*) AS c, SUM(IsRefresh), AVG(ResolutionWidth) FROM hits WHERE SearchPhrase <> '' GROUP BY WatchID, ClientIP ORDER BY c DESC LIMIT 10;
SELECT WatchID, ClientIP, COUNT(*) AS c, SUM(IsRefresh), AVG(ResolutionWidth) FROM hits GROUP BY WatchID, ClientIP ORDER BY c DESC LIMIT 10;
SELECT URL, COUNT(*) AS c FROM hits GROUP BY URL ORDER BY c DESC LIMIT 10;
SELECT 1, URL, COUNT(*) AS c FROM hits GROUP BY 1, URL ORDER BY c DESC LIMIT 10;
SELECT ClientIP, ClientIP - 1, ClientIP - 2, ClientIP - 3, COUNT(*) AS c FROM hits GROUP BY ClientIP, ClientIP - 1, ClientIP - 2, ClientIP - 3 ORDER BY c DESC LIMIT 10;
SELECT URL, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND DontCountHits = 0 AND IsRefresh = 0 AND URL <> '' GROUP BY URL ORDER BY PageViews DESC LIMIT 10;
SELECT Title, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND DontCountHits = 0 AND IsRefresh = 0 AND Title <> '' GROUP BY Title ORDER BY PageViews DESC LIMIT 10;
SELECT URL, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND IsRefresh = 0 AND IsLink <> 0 AND IsDownload = 0 GROUP BY URL ORDER BY PageViews DESC LIMIT 10 OFFSET 1000;
SELECT TraficSourceID, SearchEngineID, AdvEngineID, CASE WHEN (SearchEngineID = 0 AND AdvEngineID = 0) THEN Referer ELSE '' END AS Src, URL AS Dst, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND IsRefresh = 0 GROUP BY TraficSourceID, SearchEngineID, AdvEngineID, Src, Dst ORDER BY PageViews DESC LIMIT 10 OFFSET 1000;
SELECT URLHash, EventDate, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND IsRefresh = 0 AND TraficSourceID IN (-1, 6) AND RefererHash = 3594120000172545465 GROUP BY URLHash, EventDate ORDER BY PageViews DESC LIMIT 10 OFFSET 100;
SELECT WindowClientWidth, WindowClientHeight, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-01' AND EventDate <= '2013-07-31' AND IsRefresh = 0 AND DontCountHits = 0 AND URLHash = 2868770270353813622 GROUP BY WindowClientWidth, WindowClientHeight ORDER BY PageViews DESC LIMIT 10 OFFSET 10000;
SELECT DATE_TRUNC('minute', EventTime) AS M, COUNT(*) AS PageViews FROM hits WHERE CounterID = 62 AND EventDate >= '2013-07-14' AND EventDate <= '2013-07-15' AND IsRefresh = 0 AND DontCountHits = 0 GROUP BY DATE_TRUNC('minute', EventTime) ORDER BY DATE_TRUNC('minute', EventTime) LIMIT 10 OFFSET 1000;