Merge pull request #1661 from Sherlock113/yeq

Update UI description in log collection guides
Sherlock113 2021-06-02 15:39:29 +08:00 committed by GitHub
commit b5e2f7edef
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
7 changed files with 57 additions and 57 deletions

View File

@@ -1,5 +1,5 @@
---
linkTitle: "Log Collections"
linkTitle: "Log Collection"
weight: 8620
_build:

View File

@@ -1,5 +1,5 @@
---
title: "Add Elasticsearch as a Receiver (i.e. Collector)"
title: "Add Elasticsearch as a Receiver"
keywords: 'Kubernetes, log, elasticsearch, pod, container, fluentbit, output'
description: 'Learn how to add Elasticsearch to receive logs, events or auditing logs.'
linkTitle: "Add Elasticsearch as a Receiver"
@@ -9,7 +9,7 @@ You can use Elasticsearch, Kafka and Fluentd as log receivers in KubeSphere. Thi
## Prerequisites
- You need an account granted a role including the authorization of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the authorization and assign it to an account.
- You need an account granted a role including the permission of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the permission and assign it to an account.
- Before adding a log receiver, you need to enable any of the `logging`, `events` or `auditing` components. For more information, see [Enable Pluggable Components](../../../../pluggable-components/). `logging` is enabled as an example in this tutorial.
@@ -17,21 +17,21 @@ You can use Elasticsearch, Kafka and Fluentd as log receivers in KubeSphere. Thi
1. Log in to KubeSphere as `admin`. Click **Platform** in the top left corner and select **Cluster Management**.
2. If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster. If you have not enabled the feature, refer to the next step directly.
{{< notice note >}}
3. On the **Cluster Management** page, go to **Log Collections** in **Cluster Settings**.
If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster.
4. Click **Add Log Collector** and choose **Elasticsearch**.
{{</ notice >}}
![add-receiver](/images/docs/cluster-administration/cluster-settings/log-collections/add-es-as-receiver/add-receiver.png)
2. On the **Cluster Management** page, go to **Log Collection** in **Cluster Settings**.
5. Provide the Elasticsearch service address and port as below:
3. Click **Add Log Receiver** and choose **Elasticsearch**.
4. Provide the Elasticsearch service address and port as below:
![add-es](/images/docs/cluster-administration/cluster-settings/log-collections/add-es-as-receiver/add-es.png)
6. Elasticsearch will appear in the receiver list on the **Log Collections** page, the status of which is **Collecting**.
5. Elasticsearch will appear in the receiver list on the **Log Collection** page, the status of which is **Collecting**.
![receiver-list](/images/docs/cluster-administration/cluster-settings/log-collections/add-es-as-receiver/receiver-list.png)
7. To verify whether Elasticsearch is receiving logs sent from Fluent Bit, click **Log Search** in the **Toolbox** in the bottom right corner and search logs on the console. For more information, read [Log Query](../../../../toolbox/log-query/).
6. To verify whether Elasticsearch is receiving logs sent from Fluent Bit, click **Log Search** in the **Toolbox** in the bottom right corner and search logs on the console. For more information, read [Log Query](../../../../toolbox/log-query/).
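If you also want to confirm this on the Elasticsearch side, you can query the receiver directly. A minimal sketch, assuming the address and port entered above are reachable from your machine and that container logs are written to indices with a `ks-logstash-log` prefix (the prefix here is an assumption; see the introduction page for the exact index prefixes):

```bash
# List indices on the receiver and look for KubeSphere log indices (prefix assumed)
curl -s "http://<elasticsearch-address>:9200/_cat/indices?v" | grep ks-logstash-log
```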

View File

@@ -1,5 +1,5 @@
---
title: "Add Fluentd as a Receiver (i.e. Collector)"
title: "Add Fluentd as a Receiver"
keywords: 'Kubernetes, log, fluentd, pod, container, fluentbit, output'
description: 'Learn how to add Fluentd to receive logs, events or auditing logs.'
linkTitle: "Add Fluentd as a Receiver"
@@ -13,7 +13,7 @@ You can use Elasticsearch, Kafka and Fluentd as log receivers in KubeSphere. Thi
## Prerequisites
- You need an account granted a role including the authorization of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the authorization and assign it to an account.
- You need an account granted a role including the permission of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the permission and assign it to an account.
- Before adding a log receiver, you need to enable any of the `logging`, `events` or `auditing` components. For more information, see [Enable Pluggable Components](../../../../pluggable-components/). `logging` is enabled as an example in this tutorial.
@@ -120,29 +120,32 @@ spec:
EOF
```
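Before adding the receiver, you can check that the objects created above are running. A quick sketch, assuming the Deployment and Service are both named `fluentd` and were created in the `default` namespace as in the manifest above; adjust the names if yours differ:

```bash
# Confirm the Fluentd Deployment is ready and note the port exposed by its Service
kubectl -n default get deployment fluentd
kubectl -n default get service fluentd
```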
## Step 2: Add Fluentd as a Log Receiver (i.e. Collector)
## Step 2: Add Fluentd as a Log Receiver
1. Log in to KubeSphere as `admin`. Click **Platform** in the top left corner and select **Cluster Management**.
2. If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster. If you have not enabled the feature, refer to the next step directly.
3. On the **Cluster Management** page, go to **Log Collections** in **Cluster Settings**.
4. Click **Add Log Collector** and choose **Fluentd**.
{{< notice note >}}
![add-receiver](/images/docs/cluster-administration/cluster-settings/log-collections/add-fluentd-as-receiver/add-receiver.png)
If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster.
5. Provide the Fluentd service address and port as below:
{{</ notice >}}
2. On the **Cluster Management** page, go to **Log Collection** in **Cluster Settings**.
3. Click **Add Log Receiver** and choose **Fluentd**.
4. Provide the Fluentd service address and port as below:
![add-fluentd](/images/docs/cluster-administration/cluster-settings/log-collections/add-fluentd-as-receiver/add-fluentd.png)
6. Fluentd will appear in the receiver list on the **Log Collections** page, the status of which is **Collecting**.
5. Fluentd will appear in the receiver list on the **Log Collection** page, the status of which is **Collecting**.
![receiver-list](/images/docs/cluster-administration/cluster-settings/log-collections/add-fluentd-as-receiver/receiver-list.png)
## Step 3: Verify Fluentd is Receiving Logs Sent from Fluent Bit
1. Click **Application Workloads** on the **Cluster Management** page.
2. Select **Workloads** and then select the `default` project from the drop-down list in the **Deployments** tab.
2. Select **Workloads** and then select the `default` project from the drop-down list on the **Deployments** tab.
3. Click the **fluentd** item and then select the **fluentd-xxxxxxxxx-xxxxx** Pod.
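Alternatively, you can verify from the command line instead of the console. A minimal sketch, assuming the Deployment from Step 1 is named `fluentd` and runs in the `default` namespace:

```bash
# Tail Fluentd's stdout; forwarded container logs should start appearing here
kubectl -n default logs deploy/fluentd --tail=20 -f
```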

View File

@@ -1,5 +1,5 @@
---
title: "Add Kafka as a Receiver (i.e. Collector)"
title: "Add Kafka as a Receiver"
keywords: 'Kubernetes, log, kafka, pod, container, fluentbit, output'
description: 'Learn how to add Kafka to receive logs, events or auditing logs.'
linkTitle: "Add Kafka as a Receiver"
@@ -13,7 +13,7 @@ You can use Elasticsearch, Kafka and Fluentd as log receivers in KubeSphere. Thi
## Prerequisites
- You need an account granted a role including the authorization of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the authorization and assign it to an account.
- You need an account granted a role including the permission of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the permission and assign it to an account.
- Before adding a log receiver, you need to enable any of the `logging`, `events` or `auditing` components. For more information, see [Enable Pluggable Components](../../../../pluggable-components/). `logging` is enabled as an example in this tutorial.
## Step 1: Create a Kafka Cluster and a Kafka Topic
@@ -103,21 +103,25 @@ You can use [strimzi-kafka-operator](https://github.com/strimzi/strimzi-kafka-op
1. Log in to KubeSphere as `admin`. Click **Platform** in the top left corner and select **Cluster Management**.
2. If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster. If you have not enabled the feature, refer to the next step directly.
{{< notice note >}}
3. On the **Cluster Management** page, go to **Log Collections** in **Cluster Settings**.
If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster.
4. Click **Add Log Collector** and select **Kafka**. Enter the Kafka broker address and port as below, and then click **OK** to continue.
{{</ notice >}}
```bash
my-cluster-kafka-0.my-cluster-kafka-brokers.default.svc 9092
my-cluster-kafka-1.my-cluster-kafka-brokers.default.svc 9092
my-cluster-kafka-2.my-cluster-kafka-brokers.default.svc 9092
```
2. On the **Cluster Management** page, go to **Log Collection** in **Cluster Settings**.
3. Click **Add Log Receiver** and select **Kafka**. Enter the Kafka broker address and port as below, and then click **OK** to continue.
| Address | Port |
| ------------------------------------------------------- | ---- |
| my-cluster-kafka-0.my-cluster-kafka-brokers.default.svc | 9092 |
| my-cluster-kafka-1.my-cluster-kafka-brokers.default.svc | 9092 |
| my-cluster-kafka-2.my-cluster-kafka-brokers.default.svc | 9092 |
![add-kafka](/images/docs/cluster-administration/cluster-settings/log-collections/add-kafka-as-receiver/add-kafka.png)
5. Run the following commands to verify whether the Kafka cluster is receiving logs sent from Fluent Bit:
4. Run the following commands to verify whether the Kafka cluster is receiving logs sent from Fluent Bit:
```bash
# Start a util container
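# The lines below are an illustrative sketch only, not part of the original guide:
# one way to read the topic from a temporary Pod. The image, the bootstrap Service
# name, and the topic name are assumptions based on the Strimzi defaults in Step 1.
kubectl run kafka-consumer --rm -i --restart=Never --image=edenhill/kcat:1.7.1 --command -- \
  kcat -C -b my-cluster-kafka-bootstrap.default.svc:9092 -t kube-events -c 5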

View File

@@ -1,7 +1,7 @@
---
title: "Introduction to Log Collections"
title: "Introduction to Log Collection"
keywords: 'Kubernetes, log, elasticsearch, kafka, fluentd, pod, container, fluentbit, output'
description: 'Learn the basics of cluster log collections, including tools and general steps.'
description: 'Learn the basics of cluster log collection, including tools and general steps.'
linkTitle: "Introduction"
weight: 8621
---
@@ -12,11 +12,11 @@ This tutorial gives a brief introduction about the general steps of adding log r
## Prerequisites
- You need an account granted a role including the authorization of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the authorization and assign it to an account.
- You need an account granted a role including the permission of **Cluster Management**. For example, you can log in to the console as `admin` directly or create a new role with the permission and assign it to an account.
- Before adding a log receiver, you need to enable any of the `logging`, `events` or `auditing` components. For more information, see [Enable Pluggable Components](../../../../pluggable-components/).
## Add a Log Receiver (i.e. Collector) for Container Logs
## Add a Log Receiver for Container Logs
To add a log receiver:
@@ -24,13 +24,15 @@ To add a log receiver:
2. Click **Platform** in the top left corner and select **Cluster Management**.
3. If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster. If you have not enabled the feature, refer to the next step directly.
{{< notice note >}}
4. Go to **Log Collections** in **Cluster Settings**.
If you have enabled the [multi-cluster feature](../../../../multicluster-management/), you can select a specific cluster.
5. Click **Add Log Collector** in the **Logging** tab.
{{</ notice >}}
![log-collections](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/log-collections.png)
3. Go to **Log Collection** under **Cluster Settings** in the sidebar.
4. Click **Add Log Receiver** on the **Logging** tab.
{{< notice note >}}
@@ -57,11 +59,9 @@ Kafka is often used to receive logs and serves as a broker to other processing s
If you need to output logs to more places other than Elasticsearch or Kafka, you can add Fluentd as a log receiver. Fluentd has numerous output plugins which can forward logs to various destinations such as S3, MongoDB, Cassandra, MySQL, syslog, and Splunk. [Add Fluentd as a Receiver](../add-fluentd-as-receiver/) demonstrates how to add Fluentd to receive Kubernetes logs.
## Add a Log Receiver (i.e. Collector) for Events or Auditing Logs
## Add a Log Receiver for Events or Auditing Logs
Starting from KubeSphere v3.0.0, the logs of Kubernetes events and the auditing logs of Kubernetes and KubeSphere can be archived in the same way as container logs. The tab **Events** or **Auditing** on the **Log Collections** page will appear if `events` or `auditing` is enabled accordingly in [ClusterConfiguration](https://github.com/kubesphere/kubekey/blob/release-1.1/docs/config-example.md). You can go to the corresponding tab to configure log receivers for Kubernetes events or Kubernetes and KubeSphere auditing logs.
![log-collections-events](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/log-collections-events.png)
Starting from KubeSphere v3.0.0, the logs of Kubernetes events and the auditing logs of Kubernetes and KubeSphere can be archived in the same way as container logs. The tab **Events** or **Auditing** on the **Log Collection** page will appear if `events` or `auditing` is enabled accordingly in [ClusterConfiguration](https://github.com/kubesphere/kubekey/blob/release-1.1/docs/config-example.md). You can go to the corresponding tab to configure log receivers for Kubernetes events or Kubernetes and KubeSphere auditing logs.
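For reference, on a running cluster both components are switched on in the same ClusterConfiguration object. A minimal sketch, assuming the default installer object `ks-installer` in the `kubesphere-system` namespace, where you set `events.enabled` or `auditing.enabled` to `true` under `spec`:

```bash
# Open the ClusterConfiguration and enable the events and/or auditing components
kubectl -n kubesphere-system edit clusterconfiguration ks-installer
```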
Container logs, Kubernetes events and Kubernetes and KubeSphere auditing logs should be stored in different Elasticsearch indices to be searched in KubeSphere. The index prefixes are:
@@ -73,26 +73,19 @@ Container logs, Kubernetes events and Kubernetes and KubeSphere auditing logs sh
You can turn a log receiver on or off without adding or deleting it. To turn a log receiver on or off:
1. On the **Log Collections** page, click a log receiver and go to the receiver's detail page.
1. On the **Log Collection** page, click a log receiver and go to the receiver's detail page.
2. Click **More** and select **Change Status**.
![more](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/more.png)
3. Select **Activate** or **Close** to turn the log receiver on or off.
![change-status](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/change-status.png)
4. A log receiver's status will be changed to **Close** if you turn it off, otherwise the status will be **Collecting** on the **Log Collection** page.
4. A log receiver's status will be changed to **Close** if you turn it off, otherwise the status will be **Collecting**.
![receiver-status](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/receiver-status.png)
## Modify or Delete a Log Receiver
You can modify a log receiver or delete it:
1. On the **Log Collections** page, click a log receiver and go to the receiver's detail page.
1. On the **Log Collection** page, click a log receiver and go to the receiver's detail page.
2. Edit a log receiver by clicking **Edit** or **Edit YAML** from the drop-down list.
![more](/images/docs/cluster-administration/cluster-settings/log-collections/introduction/more.png)
3. Delete a log receiver by clicking **Delete Log Collector**.
3. Delete a log receiver by clicking **Delete Log Receiver**.

Binary file not shown.

Before: 15 KiB, After: 30 KiB

Binary file not shown.

Before: 43 KiB, After: 145 KiB