What to learn in Java development and how to learn it.
Stage 1: Static web page foundations (HTML/CSS)
1. Difficulty: one star
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: common HTML tags, common CSS layout, styling and positioning, and methods for designing and building static pages.
4. The description is as follows:
Technically, the code used in this stage is simple and easy to learn and understand. Looking ahead to the later courses, because we focus on big data, we need to build up programming skill and programming thinking early on. According to our project managers, who have many years of development and teaching experience, J2EE is currently the most worthwhile technology on the market to understand and master, and J2EE cannot be separated from page technology. So the focus of the first stage is page technology, using the market mainstream: HTML and CSS.
Stage 2: Java SE and Java Web
1. Difficulty: two stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: Java basic syntax, Java object orientation (classes, objects, encapsulation, inheritance, polymorphism, abstract classes, interfaces, public classes, inner classes, access modifiers such as public, etc.), exceptions, collections, files, I/O, MySQL (basic SQL statement operations, multi-table queries, subqueries, stored procedures, transactions, distributed transactions), JDBC, and so on.
4. The description is as follows:
This is the Java foundation stage: technical points progress from simple to deep, real business projects are broken down into modules, and various storage methods are designed and implemented. This stage is the most important of the first four, because all later stages build on it, and it is also the most intensive part of the big data learning path. At this stage, learners work as a team for the first time to develop and deliver a real project (a comprehensive application of the stage 1 and stage 2 technologies).
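As a taste of the JDBC material in this stage, here is a minimal sketch of querying MySQL from Java with a prepared statement. The database name, table, and credentials are hypothetical placeholders, not part of the course material.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class JdbcDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical local MySQL database "demo" with a "users" table.
        String url = "jdbc:mysql://localhost:3306/demo";
        try (Connection conn = DriverManager.getConnection(url, "root", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT id, name FROM users WHERE id = ?")) {
            ps.setInt(1, 1); // bind the query parameter safely
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                }
            }
        }
    }
}
```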
Stage 3: Front-end frameworks
1. Difficulty: two stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability): 64 class hours.
3. The main technologies include: combining Java with jQuery, annotations and reflection, XML and XML parsing, dom4j, JAXB, the new features of JDK 8.0, SVN, Maven, and easyui.
4. The description is as follows:
Building on the first two stages, turning static pages into dynamic ones greatly enriches what our web pages can present. Of course, on the market this work is often done by dedicated front-end designers; the goal of our curriculum design here is that front-end technology exercises thinking and design ability in a very direct, visual way. At the same time, we fold the more advanced topics of stage 2 into this stage, so learners move up another step.
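To illustrate the annotation-and-reflection topic from the technology list above, here is a minimal, self-contained sketch: a custom annotation is read back at runtime via reflection to decide which method to invoke. The annotation and class names are invented for the example.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class ReflectionDemo {
    // A custom annotation kept at runtime so reflection can read it.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Route {
        String value();
    }

    static class PageController {
        @Route("/hello")
        public void hello() {
            System.out.println("hello page");
        }
    }

    public static void main(String[] args) throws Exception {
        // Scan the controller's methods and invoke the one mapped to "/hello".
        PageController controller = new PageController();
        for (Method m : PageController.class.getDeclaredMethods()) {
            Route route = m.getAnnotation(Route.class);
            if (route != null && route.value().equals("/hello")) {
                m.invoke(controller);
            }
        }
    }
}
```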
Stage 4: Enterprise-level development frameworks
1. Difficulty: three stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: Hibernate, Spring, Spring MVC, log4j/slf4j integration, MyBatis, Struts2, Shiro, Redis, the Activiti process engine, the Nutch crawler, Lucene, Tomcat clustering and hot standby, and MySQL read-write separation.
4. The description is as follows:
If we compare the whole Java course to a pastry shop, then with the first three stages we can make a Wu Dalang sesame pancake (hand-made, and a lot of trouble), whereas learning the frameworks lets us open a Starbucks (high-tech equipment that saves time and effort). As far as the job requirements for a J2EE development engineer are concerned, the technologies in this stage must be mastered. What we teach goes beyond the market (the market mainly uses three mainstream frameworks; we teach seven framework technologies), and the teaching is driven by real commercial projects: requirements documents, overall design, detailed design, source code, testing, deployment, installation manuals, and so on are all covered.
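As a small illustration of the framework style taught here, the following is a minimal Spring MVC controller sketch, assuming Spring Web is on the classpath; the class name and URL path are purely illustrative.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// A hypothetical endpoint; class and path names are illustrative only.
@RestController
public class GreetingController {

    @GetMapping("/greetings/{name}")
    public String greet(@PathVariable String name) {
        // Spring MVC maps the HTTP GET request to this method and
        // writes the return value into the response body.
        return "Hello, " + name;
    }
}
```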
Stage 5: Understanding big data
1. Difficulty: three stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: an initial look at big data (what big data is, application scenarios, how to learn big data, the concept of virtual machines and their installation, etc.), common Linux commands (file management, system management, disk management), Linux shell programming (shell variables, loop control, applications), an introduction to Hadoop (what Hadoop consists of, a standalone environment, directory structure, the HDFS interface, the MapReduce interface, simple shell usage, accessing Hadoop from Java), HDFS (introduction, the shell, using the IDEA development tools, building a fully distributed cluster), MapReduce applications (the intermediate computation process, driving MapReduce from Java, running programs, log monitoring), and advanced Hadoop applications (an introduction to the YARN framework, configuration items and optimization, an introduction to CDH, environment construction).
4. The description is as follows:
This stage is designed to give newcomers a broad picture of big data. How does it relate to what came before? After the Java foundation courses, you understand how a program runs on a single machine. So what about big data? Big data is processed by programs running on large clusters of machines. And since big data is ultimately about processing data, data storage likewise moves from single-machine storage to large-scale multi-machine cluster storage.
(You ask what a cluster is? OK, I have a big pot of rice. I could finish it by myself, but it would take a long time, so I invite everyone to eat together. One person is just a person; what do you call a lot of people? A crowd. That is a cluster.)
So big data can be roughly divided into big data storage and big data processing. At this stage our course is built around the standard of big data: Hadoop. And big data does not run on the Windows 7 or Windows 10 we use every day, but on the most widely used system in this field: Linux.
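To make "Java accessing Hadoop" concrete, here is a minimal sketch that lists the HDFS root directory through the Hadoop FileSystem API; the NameNode address is a hypothetical placeholder.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsListDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address; in a real cluster this usually
        // comes from core-site.xml instead of being hard-coded.
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        try (FileSystem fs = FileSystem.get(conf)) {
            // List the files and directories under the HDFS root.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
            }
        }
    }
}
```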
Stage 6: Big Data Database
1. Difficulty: four stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: an introduction to Hive (what Hive is, Hive usage scenarios, environment setup, architecture and working mechanism), Hive shell programming (creating tables, query statements, partitions and buckets, index management and views), advanced Hive usage (how DISTINCT is implemented, GROUP BY, JOIN, how SQL is translated, Java programming, configuration and optimization), an introduction to HBase, HBase shell programming (DDL, DML, and creating tables, querying, compression and filtering from Java), a detailed look at the HBase modules (an introduction to Region, HRegionServer, HMaster and ZooKeeper, ZooKeeper configuration, integrating HBase with ZooKeeper), and advanced HBase features (the read and write paths, the data model, schema design around read and write hotspots, optimization and configuration).
4. The description is as follows:
This stage is designed to show how big data handles large-scale data: by simplifying programming and speeding up reads.
How is it simplified? In the previous stage, writing MapReduce programs by hand gets very complicated once complex business joins and data mining are involved. So at this stage we introduce Hive, the data warehouse of the big data world. That is the key phrase: data warehouse. I know you will ask, so let me say it first: a data warehouse is usually a huge data center used for data mining and analysis. It stores these data, usually in large databases such as Oracle or DB2; such databases are normally used for real-time online business.
In short, data analysis on a data warehouse is relatively slow. But it is convenient: as long as you know SQL, it is fairly easy to learn, and Hive is exactly such a tool, an SQL query tool on top of big data. This stage also covers HBase, a database in the big data world. You may wonder: didn't we just learn a data "warehouse" called Hive? Hive runs on MapReduce, so its queries are quite slow; HBase, by contrast, can query data in real time on top of big data. One is mainly for analysis, the other mainly for queries.
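To show what "an SQL query tool on top of big data" looks like in practice, here is a minimal sketch that runs a Hive query over JDBC against HiveServer2; the server address and table name are hypothetical placeholders, and the Hive JDBC driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical HiveServer2 address and table name.
        String url = "jdbc:hive2://hiveserver:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "", "");
             Statement stmt = conn.createStatement();
             // Ordinary SQL; Hive compiles it into batch jobs under the hood.
             ResultSet rs = stmt.executeQuery(
                     "SELECT city, COUNT(*) FROM page_views GROUP BY city")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
            }
        }
    }
}
```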
Stage 7: Real-time data acquisition
1. Difficulty: four stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: Flume log collection, an introduction to Kafka (message queues, application scenarios, cluster setup), Kafka in detail (partitions, topics, consumers and producers, integration with ZooKeeper, shell development, shell debugging), advanced Kafka usage (Java development, key configuration, optimization projects), data visualization (an introduction to graphs and charts, classes of charting tools), an introduction to Storm (design ideas, application scenarios, processing flow, cluster installation), Storm development (Storm Maven development, writing local Storm programs), advanced Storm (Java development, key configuration, optimization projects), the timeliness of Kafka asynchronous and batch sending, globally ordered Kafka messages, and Storm multi-concurrency optimization.
4. The description is as follows:
The data sources of the previous stages were existing large-scale data sets, so the results of processing and analysis arrive with a certain delay; typically the data being processed is the previous day's data.
Example scenarios: website hotlink protection, abnormal customer accounts, real-time credit checks. What if these scenarios were analyzed on the previous day's data? Would that be too late? So at this stage we introduce real-time data collection and analysis, mainly: Flume for real-time data collection from a wide range of sources, Kafka for receiving and sending data, and Storm for real-time data processing, with processing at the level of seconds.
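As a small illustration of Kafka's role as the data receiver and sender in this pipeline, here is a minimal sketch of a Java producer sending one log line to a topic; the broker address and topic name are hypothetical placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LogProducerDemo {
    public static void main(String[] args) {
        // Hypothetical broker address and topic name.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one log line asynchronously; a Storm (or Spark Streaming)
            // job would consume it from the same topic for real-time processing.
            producer.send(new ProducerRecord<>("access-logs", "user42", "GET /index.html"));
        }
    }
}
```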
Stage 8: Spark data analysis
1. Difficulty: five stars
2. Class hours (technical knowledge points + stage project tasks + comprehensive ability)
3. The main technologies include: an introduction to Scala (data types, operators, control statements, basic functions), intermediate Scala (data structures, classes, objects, traits, pattern matching, regular expressions), advanced Scala (higher-order functions, curried functions, partial functions, tail recursion, built-in higher-order functions, etc.), an introduction to Spark (environment setup, basic architecture, running modes, etc.), Spark SQL, advanced Spark (DataFrames, Datasets, how Spark Streaming works, the sources Spark Streaming supports, integration with Kafka and sockets, the programming model), advanced Spark programming (Spark GraphX, Spark MLlib machine learning), advanced Spark applications (system architecture, key configuration and performance optimization, fault and stage recovery), and the Spark ML KMeans algorithm.
4. The description is as follows:
A word about the previous stages, mainly the Hadoop ones: Hadoop analysis of large-scale data sets based on MapReduce (including machine learning and artificial intelligence) is relatively slow, and it is not well suited to iterative computation. Spark is the replacement for MapReduce on the analysis side. How does it replace it? Start with their execution models: Hadoop's analysis is based on disk storage, while Spark's analysis is based on memory. If that does not mean much to you, think of it as travelling from Beijing to Shanghai by train: MapReduce is the old green-skinned slow train, and Spark is the high-speed rail or the maglev. Also, Spark is built on the Scala language, which it naturally supports best, so the course teaches the Scala language first.
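To give a feel for Spark's in-memory analysis model (using Spark's Java API here rather than Scala), the following is a minimal word-count sketch run in local mode; the input path is a hypothetical placeholder.

```java
import java.util.Arrays;
import java.util.Map;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.SparkSession;

public class SparkWordCountDemo {
    public static void main(String[] args) {
        // "local[*]" runs Spark inside this JVM; the input path is a placeholder.
        SparkSession spark = SparkSession.builder()
                .appName("word-count")
                .master("local[*]")
                .getOrCreate();

        JavaRDD<String> lines = spark.read().textFile("data/input.txt").javaRDD();

        // Classic word count: split each line into words, then count occurrences.
        Map<String, Long> counts = lines
                .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                .countByValue();

        counts.forEach((word, n) -> System.out.println(word + " -> " + n));
        spark.stop();
    }
}
```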
In the design of the HKUST big data course, the technical requirements of the positions on the market are basically covered in full. Moreover, it does not merely cover the job requirements: the course itself is a complete big data project pipeline from front end to back end.
For example, from historical data storage and analysis (Hadoop, Hive, HBase) to real-time data storage (Flume, Kafka) and analysis (Storm, Spark), these pieces depend on one another in real projects.