Commit e1cc557

Version 1.18 has been developed and merged (#4)
* Documentation updates
* Updated the bundled official 1.17 documentation
* Started development of the new version 1.18
* Added the logic implementation of the matrix AGG functions
* Added the border object
* Improved the template-matching function implementation: added a stride parameter to raise efficiency
* Matrices are now supported by the normalized-computation component
* Update Case.md
* Updated the merge feature for image matrices
* Optimized the IO data streams
* Added third-party data-source support to the AS library, allowing image data to be acquired through a camera
* Updated the HDFS data IO component
* Version 1.18 released; the new version has been published to Maven
* Updated the changelog's links to the Chinese and English documentation
1 parent 73ddf1c commit e1cc557


61 files changed (+6926, -1332 lines)

AsLib/LibSrc/.idea/workspace.xml

Lines changed: 1 addition & 0 deletions (generated file; diff not rendered)
Lines changed: 2 additions & 2 deletions
Lines changed: 2 additions & 2 deletions

@@ -1,3 +1,3 @@
-Start testing: Mar 26 11:37
+Start testing: Apr 09 14:04
 ----------------------------------------------------------
-End testing: Mar 26 11:37
+End testing: Apr 09 14:04

AsLib/libBeardedManZhao.dll

0 Bytes
Binary file not shown.

KnowledgeDocument/Operands-Chinese.md

Lines changed: 9 additions & 6 deletions
@@ -413,14 +413,16 @@

 ### DataFrameBuilder and DataFrame

 DataFrameBuilder and DataFrame handle data loading and data analysis respectively. During loading, the DataFrameBuilder object's quick, easy-to-understand functions construct a DataFrame, which is then used to process the data.

 DataFrame ("DF" for short): in the processing stage many functions follow a SQL-style design, which lowers the learning cost and lets you focus on what matters. Next we show the basic use of DataFrameBuilder.

 #### Loading data with FDataFrame

-- Reading a database
-In the AS library you can load data into an FDataFrame data object, which provides basic data reading and processing as well as effective data management; you can load database data into an FDataFrame. A code example follows.
+- Reading a database: in the AS library you can load data into an FDataFrame data object, which provides basic data reading and processing as well as effective data management; you can load database data into an FDataFrame. A code example follows.
 - Note: when reading a database, import the JDBC driver class into your project.

 ```java
 package zhao.algorithmMagic;
@@ -447,8 +449,8 @@
     }
 }
 ```
-- Reading a file system
-For reading a file system, FDataFrame can easily read the local file system without relying on any third-party library; the concrete steps follow.
+- Reading a file system: FDataFrame can easily read the local file system without relying on any third-party library; the concrete steps follow.

 ```java
 package zhao.algorithmMagic;
@@ -481,6 +483,7 @@
     }
 }
 ```
+
 #### Comprehensive case
KnowledgeDocument/Operands.md

Lines changed: 18 additions & 8 deletions
@@ -427,19 +427,27 @@
     }
 }
 ```
+
 ## Table

 Table is the data object used for data analysis in the AS library. It is represented as a table with row and column indexes, which makes convenient data-processing tasks possible. Data can be loaded and processed in the AS library through the DataFrame object.

 ### DataFrameBuilder & DataFrame

 DataFrameBuilder and DataFrame are used for data loading and data analysis respectively. During loading, a DataFrame can be constructed through the DataFrameBuilder object's fast, easy-to-understand functions, and the DataFrame can then be used for data processing. DataFrame is called "DF" for short. In the processing stage, many functions are designed in SQL style, which effectively lowers the learning cost and lets you focus on more important things. Next, we show the basic use of DataFrameBuilder.

 #### Load data using FDataFrame

-- Read database
-In the AS library, you can load data into an FDataFrame data object, which provides basic data reading and processing as well as effective data control. You can load data from the database into an FDataFrame; next is a code example of database loading.
+- Read database: in the AS library, you can load data into an FDataFrame data object, which provides basic data reading and processing as well as effective data control. You can load data from the database into an FDataFrame; next is a code example of database loading.
 - Note: when reading the database, please import the JDBC driver class in the project.

 ```java
 package zhao.algorithmMagic;
@@ -466,8 +474,9 @@
     }
 }
 ```
-- Read file system
-FDataFrame can easily read the local file system without relying on any third-party library. Next, we will implement the concrete steps!
+- Read file system: FDataFrame can easily read the local file system without relying on any third-party library. Next, we will implement the concrete steps!

 ```java
 package zhao.algorithmMagic;
@@ -500,6 +509,7 @@
     }
 }
 ```
+
 #### Comprehensive case
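The note above about importing the JDBC driver class is the only database-specific setup: the loading itself goes through the standard `java.sql` API, and `DriverManager` can only locate a driver whose JAR is on the classpath. A minimal plain-JDBC sketch of row loading (illustrative only, not the FDataFrame implementation; the URL and credentials are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class JdbcLoadSketch {

    // Run a query and collect each row's first column as a string.
    // DriverManager only finds a driver if its JAR is on the classpath,
    // which is why the JDBC driver dependency must be imported.
    static List<String> loadFirstColumn(String url, String user, String pass, String sql)
            throws SQLException {
        List<String> out = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(url, user, pass);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                out.add(rs.getString(1));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        try {
            // Placeholder URL: without the MySQL driver JAR on the classpath,
            // this throws "No suitable driver found", which is exactly the
            // failure the note about importing the JDBC driver warns about.
            loadFirstColumn("jdbc:mysql://localhost:3306/demo", "user", "pass", "SELECT 1");
        } catch (SQLException e) {
            System.out.println("JDBC setup incomplete: " + e.getMessage());
        }
    }
}
```

A DataFrame-style loader wraps this kind of `ResultSet` iteration and turns the rows into an indexed table; the sketch shows only the raw JDBC access underneath.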

README-Chinese.md

Lines changed: 21 additions & 1 deletion
@@ -18,16 +18,19 @@
         <dependency>
             <groupId>io.github.BeardedManZhao</groupId>
             <artifactId>algorithmStar</artifactId>
-            <version>1.17</version>
+            <version>1.18</version>
         </dependency>
     </dependencies>
 ```

 ### Required dependencies of the AS library

 Since version 1.17, all of the AS library's dependencies have been stripped out. This avoids dependency bundling, reduces the chance of conflicts in your project, and lets you use the dependency configuration that best fits your needs; the third-party libraries the AS library depends on are listed here.

 #### Required dependencies

 The AS library produces log data while running many of its computation functions, so a logging dependency is essential; import it as follows.

 ```xml
     <dependencies>
         <!-- Bind using the log4j2 adapter -->
@@ -54,9 +57,11 @@
         </dependency>
     </dependencies>
 ```

 #### Optional dependencies

 When interfacing with databases, Spark, and other platforms, the AS library needs third-party dependency packages. These are optional: if you do not need those features you can skip them; if you do, refer to the configuration below.

 ```xml
     <dependencies>
         <!-- MySQL JDBC driver; if you connect to another type of relational database, modify this accordingly -->
@@ -82,6 +87,21 @@
             <artifactId>spark-mllib_2.12</artifactId>
             <version>3.1.3</version>
         </dependency>
+
+        <!-- Camera dependency library; if you need to obtain data objects through a camera, import this library -->
+        <dependency>
+            <groupId>com.github.sarxos</groupId>
+            <artifactId>webcam-capture</artifactId>
+            <version>0.3.12</version>
+        </dependency>
+
+        <!-- HDFS input/output dependency library; if you need to read and write data through the HDFS distributed storage platform, import this library -->
+        <dependency>
+            <groupId>org.apache.hadoop</groupId>
+            <artifactId>hadoop-client</artifactId>
+            <version>3.3.1</version>
+        </dependency>
+
     </dependencies>
 ```

README.md

Lines changed: 34 additions & 7 deletions
@@ -21,18 +21,25 @@
         <dependency>
             <groupId>io.github.BeardedManZhao</groupId>
             <artifactId>algorithmStar</artifactId>
-            <version>1.17</version>
+            <version>1.18</version>
         </dependency>
     </dependencies>
 ```

 ### Required dependencies of the AS library

 After version 1.17, all dependencies of the AS library have been stripped out to better avoid dependency bundling and reduce the possibility of project conflicts; at the same time, more suitable dependency configurations can be used according to developers' needs. You can view the third-party libraries on which the AS library depends here.

 #### Required Dependencies

 The AS library generates some log data when performing many calculation functions, so importing a logging dependency is essential. Please import the dependencies as follows.

 ```xml
     <dependencies>
         <!-- Binding using the adapter of log4j2 -->
         <dependency>
@@ -58,18 +65,23 @@
         </dependency>
     </dependencies>
 ```

 #### Optional Dependencies

 When interfacing with platforms such as databases and Spark, the AS library needs third-party dependency packages, which are optional. If you do not need these functions, you do not need to import them; if you do, you can refer to the following configuration.

 ```xml
     <dependencies>
         <!-- MySQL database connection driver; if the relational database you want to connect to is of another type, you can modify this accordingly -->
         <dependency>
             <groupId>mysql</groupId>
             <artifactId>mysql-connector-java</artifactId>
             <version>8.0.30</version>
         </dependency>
         <!-- Dependency packages for the three major Spark modules; import them if you need them, or leave them out if you do not -->
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-core_2.12</artifactId>
@@ -80,12 +92,27 @@
             <artifactId>spark-sql_2.12</artifactId>
             <version>3.1.3</version>
         </dependency>

         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-mllib_2.12</artifactId>
             <version>3.1.3</version>
         </dependency>
+
+        <!-- Camera device dependency library; if you need to obtain data objects through the camera, you can import this dependency -->
+        <dependency>
+            <groupId>com.github.sarxos</groupId>
+            <artifactId>webcam-capture</artifactId>
+            <version>0.3.12</version>
+        </dependency>
+
+        <!-- HDFS input/output device dependency library; if you need to read and write data through the HDFS distributed storage platform, you can introduce this library -->
+        <dependency>
+            <groupId>org.apache.hadoop</groupId>
+            <artifactId>hadoop-client</artifactId>
+            <version>3.3.1</version>
+        </dependency>
+
     </dependencies>
 ```
