HDFS Java API Programming

Level 1: File reading and writing. Knowledge points: HDFS file creation and operation steps. Step 1: get the FileSystem object; step 2: write through an FSDataOutputStream; step 3: read the file content back through an FSDataInputStream. Programming requirements: get the Hadoop system configuration and create an HDFS file at the path /user/hadoop/myfile; add the string https://www.educoder.net in […]
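The three steps above can be sketched in Java roughly as follows. This is a minimal sketch, assuming a Hadoop client dependency on the classpath and a running HDFS instance; the NameNode URI hdfs://localhost:9000 is a placeholder to replace with your own cluster address.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFileDemo {
    public static void main(String[] args) throws Exception {
        // Step 1: get the FileSystem object from the Hadoop configuration.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000"); // placeholder NameNode URI

        Path file = new Path("/user/hadoop/myfile");
        try (FileSystem fs = FileSystem.get(conf)) {
            // Step 2: write the string through an FSDataOutputStream.
            try (FSDataOutputStream out = fs.create(file, true)) {
                out.write("https://www.educoder.net".getBytes(StandardCharsets.UTF_8));
            }
            // Step 3: read the content back through an FSDataInputStream and print it.
            try (FSDataInputStream in = fs.open(file);
                 BufferedReader reader = new BufferedReader(
                         new InputStreamReader(in, StandardCharsets.UTF_8))) {
                System.out.println(reader.readLine());
            }
        }
    }
}
```

Note that fs.create(file, true) overwrites an existing file; drop the second argument if overwriting should fail instead.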

FastDFS+Nginx installation configuration

FastDFS + Nginx installation configuration. Contents: download and upload the installation package to the server; project address; FastDFS installation: 1. install the FastDFS dependency libraries (1. compilation environment; 2. install the libfastcommon and libserverframe libraries); 2. install FastDFS and configure it; 3. start it (choose one of the following startup methods: file-mode startup, or configure systemd startup) 1. […]

DFS and BFS: depth-first and breadth-first search algorithms

1. What are BFS and DFS? 1.1 What is BFS? BFS (Breadth-First Search) is a graph traversal algorithm that begins at a start node and expands the search outward layer by layer until the target node is found. The algorithm is often used to solve “shortest path” problems, such as finding the shortest path from […]
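To make the layer-by-layer idea concrete, here is a small self-contained sketch (the graph, node labels, and method name are invented for illustration) that uses BFS to compute the minimum number of edges between two nodes:

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;

public class BfsDemo {
    // Returns the minimum number of edges from start to target, or -1 if unreachable.
    static int shortestPath(Map<String, List<String>> graph, String start, String target) {
        Queue<String> queue = new ArrayDeque<>();
        Map<String, Integer> dist = new HashMap<>();
        queue.add(start);
        dist.put(start, 0);
        while (!queue.isEmpty()) {
            String node = queue.poll();            // take the oldest frontier node
            if (node.equals(target)) {
                return dist.get(node);
            }
            for (String next : graph.getOrDefault(node, List.of())) {
                if (!dist.containsKey(next)) {     // first visit is the shortest distance
                    dist.put(next, dist.get(node) + 1);
                    queue.add(next);
                }
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        Map<String, List<String>> graph = Map.of(
                "A", List.of("B", "C"),
                "B", List.of("D"),
                "C", List.of("D"),
                "D", List.of());
        System.out.println(shortestPath(graph, "A", "D")); // prints 2
    }
}
```

Because every edge has the same cost, the first time BFS reaches a node is guaranteed to be via a shortest path; DFS offers no such guarantee, which is why BFS is the usual choice for unweighted shortest-path problems.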

Flume construction and installation; connection refused when uploading from the HDFS web page (Trying ::1… telnet: connect to address ::1: Connection refused)

Table of contents: 1. Flume (1. features of Flume; 2. what Flume can do; 3. Flume collection and storage; 4. Flume’s three major components; 5. link to the Chinese version of the Flume official website); 2. Install Flume ((1) upload and decompress the software package; (2) configure environment variables); 3. Test Flume ((1) edit the Flume configuration file and start […]
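For the configuration-and-test step, the classic single-node example from the Flume user guide is a good starting point: an agent (named a1 here by convention) with a netcat source, a memory channel, and a logger sink.

```properties
# example.conf: a single-node Flume agent (a1 is a conventional agent name)
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Source: listen on a local TCP port for newline-separated events
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Sink: log events to the console
a1.sinks.k1.type = logger

# Channel: buffer events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Wire source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

Start it with `bin/flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console`, then send a test line via `telnet localhost 44444` and watch it appear in the agent log.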

Using Java API to operate HDFS

(1) Experimental principle. Using the Java API to operate HDFS works as follows. Configure the Hadoop environment: first, you need to set the Hadoop installation path and configure the core-site.xml and hdfs-site.xml files so that Java programs can connect to HDFS. Introduce the Hadoop dependencies: in a Java project, you need […]
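For the dependency step, a typical Maven fragment looks like the sketch below; the version number is an example and should match the Hadoop version of your cluster.

```xml
<!-- pom.xml: Hadoop client dependency (version is a placeholder) -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.3.6</version>
</dependency>
```

The hadoop-client artifact pulls in the HDFS and common libraries transitively, so a single dependency is usually enough for FileSystem-based programs.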

Using FastDFS and Nginx with port mapping to enable remote access to a local file server

Article directory: Preface; 1. Build the FastDFS file system locally (1.1 environment installation; 1.2 install libfastcommon; 1.3 install FastDFS; 1.4 configure the Tracker; 1.5 configure the Storage; 1.6 test upload and download; 1.7 integration with Nginx; 1.8 install Nginx; 1.9 configure Nginx); 2. Test LAN access to FastDFS; 3. Install the cpolar NAT-traversal (intranet penetration) tool; 4. Configure public network […]
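As an illustration of steps 1.4 and 1.5, the entries usually edited in tracker.conf and storage.conf look roughly like this; all paths and the tracker IP below are placeholders for your own environment.

```properties
# /etc/fdfs/tracker.conf (key entries only)
port = 22122
# data and log directory (placeholder path)
base_path = /home/fastdfs/tracker

# /etc/fdfs/storage.conf (key entries only)
group_name = group1
port = 23000
base_path = /home/fastdfs/storage
# where uploaded files are actually stored (placeholder path)
store_path0 = /home/fastdfs/storage
# address of the tracker started above (placeholder IP)
tracker_server = 192.168.1.10:22122
```

The directories named by base_path and store_path0 must exist before the daemons start, and the tracker_server entry is how each storage node registers itself with the tracker.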

Writing and using HDFS Java API

First, you need to set up a Hadoop environment and start Hadoop. If you haven’t built one yet, you can read this article: Hadoop cluster construction and configuration - CSDN Blog. Here I use IntelliJ IDEA under Windows to connect to the Hadoop instance running in a virtual machine. (1) Install Hadoop under Windows. The installed Hadoop needs to be the […]

Java API to access HDFS

1. Download IDEA. Download address: https://www.jetbrains.com/idea/download/?section=windows#section=windows. Scroll down the page to find the free Community (IC) edition. Run the downloaded exe file; note that it is best not to install it to the C drive. The install path can be changed to another disk, and the other options can be checked as needed. 2. Create a Java project. Run […]