Hadoop HDFS fails to start after reboot unless the NameNode is formatted

Problem description

I have a multi-node standalone Hadoop cluster running HDFS. I can load data into HDFS, but every time I restart the machine and bring the cluster back up with start-dfs.sh, the dashboard does not appear until I format the NameNode (`hdfs namenode -format`), which wipes all of my data. How can I start the Hadoop cluster after a reboot without having to go through the format step?

Solution

You need to shut HDFS down cleanly before powering off the machine by running stop-dfs.sh, which stops the NameNode and DataNodes in an orderly way so the NameNode can flush its metadata to disk. Otherwise the NameNode's metadata can be corrupted, and at that point a format (`hdfs namenode -format`) is the only way back to a clean state, at the cost of wiping all of your data.
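A minimal sketch of the routine described above, assuming Hadoop's sbin and bin directories are on the PATH and HDFS is already configured on the cluster:

```shell
# Before rebooting: stop HDFS cleanly so the NameNode
# flushes its metadata (fsimage and edit log) to disk.
stop-dfs.sh

# ... reboot the machine ...

# After the machine comes back up: start HDFS again.
# No "hdfs namenode -format" is needed; the NameNode
# loads the metadata it persisted at shutdown.
start-dfs.sh

# Verify that the NameNode and DataNode JVMs are running.
jps

# Confirm the data survived the restart.
hdfs dfs -ls /
```

If the NameNode still refuses to start after a clean shutdown, a common cause worth checking is its metadata directory being placed somewhere volatile that the OS clears on reboot, but the commands above cover the normal stop/start cycle.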