A simple shell script

Writing this shell script today didn't actually take much effort, so I'm not excited because I finally succeeded after a long struggle. I'm excited because I can finally settle down, do what I enjoy, and concentrate on learning what I should learn. Before this I studied all kinds of scattered things, and without a clear goal it was painful. The root cause was that I didn't know what career I would pursue: I only knew I wanted to work in IT, not in which direction, so I felt I had to learn everything, and that process was miserable for me. I'm the kind of person who likes to do things properly: either don't do it at all, or do it well. I once read a brilliant article about restlessness among programmers, and many of the restless habits it described fit me exactly, which depressed me. Now my job is settled, I know what to study, and my focus is set. It feels wonderful.

To borrow Steve Jobs's words:

The only way to be truly satisfied is to do what you believe is great work, and the only way to do great work is to love what you do.

I think anyone who reaches that point is truly fortunate. Strive, fight, realize your own worth, and be satisfied with your own performance: that is something I often tell myself.

Now my job is settled, and so is my girlfriend, which is to say my future wife. All that's left for me is to work, to strive, to fight.

I'm grateful to have found someone who supports me and cares about me. I don't know whether I'll end up very successful, but I know that with such good support behind me, whatever I do, I can do it with a steady heart. With her I am very happy, and I will make her happy too, I promise!


All right, here's the code:

#!/bin/sh

cd /hadoop/logs || exit 1

file=log_name.txt

# Start from a clean output file.
if [ -e "$file" ]; then
    rm "$file"
fi

# Hadoop log files are named hadoop-<user>-<nodetype>-<host>.log,
# so the third '-'-separated field of the name is the node type.
for cur in *.log
do
    name=`echo "$cur" | cut -d'-' -f3`

    # Keep only the timestamped lines and append the node type in brackets.
    grep '^2008' "$cur" | sed "s/^.*\$/&[$name]/" >> "$file"
done

# The timestamps sort lexicographically, so a plain sort merges
# the lines from all daemons in chronological order.
sort -o "$file" "$file"

The output looks like this:

2008-11-14 10:08:47,671 INFO org.apache.hadoop.dfs.NameNode: STARTUP_MSG: [namenode]
2008-11-14 10:08:48,140 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000[namenode]
2008-11-14 10:08:48,171 INFO org.apache.hadoop.dfs.NameNode: Namenode up at: bacoo/192.168.1.34:9000[namenode]
2008-11-14 10:08:48,171 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null[namenode]
2008-11-14 10:08:48,234 INFO org.apache.hadoop.dfs.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.dfs.FSNamesystemMetrics: Initializing FSNamesystemMeterics using context object:org.apache.hadoop.metrics.spi.NullContext[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[namenode]
2008-11-14 10:08:48,875 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[namenode]
2008-11-14 10:08:48,890 INFO org.apache.hadoop.fs.FSNamesystem: Registered FSNamesystemStatusMBean[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 4 edits # 0 loaded in 0 seconds.[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Image file of size 80 loaded in 0 seconds.[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[namenode]
2008-11-14 10:08:48,953 INFO org.apache.hadoop.fs.FSNamesystem: Finished loading FSImage in 657 msecs[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Leaving safe mode after 0 secs.[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes[namenode]
2008-11-14 10:08:49,000 INFO org.apache.hadoop.dfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks[namenode]
2008-11-14 10:08:49,609 INFO org.mortbay.util.Credential: Checking Resource aliases[namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][namenode]
2008-11-14 10:08:50,015 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][namenode]
2008-11-14 10:08:54,656 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@17f11fb[namenode]
2008-11-14 10:08:55,453 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][namenode]
2008-11-14 10:08:55,468 INFO org.apache.hadoop.fs.FSNamesystem: Web-server up at: 0.0.0.0:50070[namenode]
2008-11-14 10:08:55,468 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50070[namenode]
2008-11-14 10:08:55,468 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@61a907[namenode]
2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[namenode]
2008-11-14 10:08:55,484 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000: starting[namenode]
2008-11-14 10:08:55,515 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9000: starting[namenode]
2008-11-14 10:08:55,531 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9000: starting[namenode]
2008-11-14 10:08:56,015 INFO org.apache.hadoop.dfs.NameNode.Secondary: STARTUP_MSG: [secondarynamenode]
2008-11-14 10:08:56,156 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=SecondaryNameNode, sessionId=null[secondarynamenode]
2008-11-14 10:08:56,468 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
2008-11-14 10:08:56,546 INFO org.mortbay.util.Credential: Checking Resource aliases[secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][secondarynamenode]
2008-11-14 10:08:56,609 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][secondarynamenode]
2008-11-14 10:08:56,953 INFO org.mortbay.jetty.servlet.XMLConfiguration: No WEB-INF/web.xml in file:/E:/cygwin/hadoop/webapps/secondary. Serving files and default/dynamic servlets only[secondarynamenode]
2008-11-14 10:08:56,953 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@b1a4e2[secondarynamenode]
2008-11-14 10:08:57,062 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][secondarynamenode]
2008-11-14 10:08:57,078 INFO org.apache.hadoop.dfs.NameNode.Secondary: Secondary Web-server up at: 0.0.0.0:50090[secondarynamenode]
2008-11-14 10:08:57,078 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50090[secondarynamenode]
2008-11-14 10:08:57,078 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@18a8ce2[secondarynamenode]
2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint Period   :3600 secs (60 min)[secondarynamenode]
2008-11-14 10:08:57,078 WARN org.apache.hadoop.dfs.NameNode.Secondary: Log Size Trigger    :67108864 bytes (65536 KB)[secondarynamenode]
2008-11-14 10:08:59,828 INFO org.apache.hadoop.mapred.JobTracker: STARTUP_MSG: [jobtracker]
2008-11-14 10:09:00,015 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=JobTracker, port=9001[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 3 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 5 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 6 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 8 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server handler 9 on 9001: starting[jobtracker]
2008-11-14 10:09:00,031 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9001: starting[jobtracker]
2008-11-14 10:09:00,125 INFO org.mortbay.util.Credential: Checking Resource aliases[jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][jobtracker]
2008-11-14 10:09:01,703 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][jobtracker]
2008-11-14 10:09:02,312 INFO org.mortbay.util.Container: Started org.mortbay.jetty.servlet.WebApplicationHandler@1cd280b[jobtracker]
2008-11-14 10:09:08,359 INFO org.mortbay.util.Container: Started WebApplicationContext[/,/][jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker up at: 9001[jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.mapred.JobTracker: JobTracker webserver: 50030[jobtracker]
2008-11-14 10:09:08,375 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=[jobtracker]
2008-11-14 10:09:08,375 INFO org.mortbay.http.SocketListener: Started SocketListener on 0.0.0.0:50030[jobtracker]
2008-11-14 10:09:08,375 INFO org.mortbay.util.Container: Started org.mortbay.jetty.Server@16a9b9c[jobtracker]
2008-11-14 10:09:12,984 INFO org.apache.hadoop.mapred.JobTracker: Starting RUNNING[jobtracker]
2008-11-14 10:09:56,894 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG: [datanode]
2008-11-14 10:10:02,516 INFO org.apache.hadoop.mapred.TaskTracker: STARTUP_MSG: [tasktracker]
2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Formatting ...[datanode]
2008-11-14 10:10:08,768 INFO org.apache.hadoop.dfs.Storage: Storage directory /hadoop/hadoopfs/data is not formatted.[datanode]
2008-11-14 10:10:11,343 INFO org.apache.hadoop.dfs.DataNode: Registered FSDatasetStatusMBean[datanode]
2008-11-14 10:10:11,347 INFO org.apache.hadoop.dfs.DataNode: Opened info server at 50010[datanode]
2008-11-14 10:10:11,352 INFO org.apache.hadoop.dfs.DataNode: Balancing bandwith is 1048576 bytes/s[datanode]
2008-11-14 10:10:16,430 INFO org.mortbay.util.Credential: Checking Resource aliases[tasktracker]
2008-11-14 10:10:17,976 INFO org.mortbay.util.Credential: Checking Resource aliases[datanode]
2008-11-14 10:10:20,068 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[datanode]
2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][datanode]
2008-11-14 10:10:20,089 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][datanode]
2008-11-14 10:10:20,725 INFO org.mortbay.http.HttpServer: Version Jetty/5.1.4[tasktracker]
2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/logs,/logs][tasktracker]
2008-11-14 10:10:20,727 INFO org.mortbay.util.Container: Started HttpContext[/static,/static][tasktracker]
2008-11-14 10:10:27,078 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/localhost[jobtracker]
2008-11-14 10:10:32,171 INFO org.apache.hadoop.dfs.StateChange: BLOCK* NameSystem.registerDatanode: node registration from 192.168.1.167:50010 storage DS-1556534590-127.0.0.1-50010-1226628640386[namenode]
2008-11-14 10:10:32,187 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/192.168.1.167:50010[namenode]
2008-11-14 10:13:57,171 WARN org.apache.hadoop.dfs.Storage: Checkpoint directory \tmp\hadoop-SYSTEM\dfs\namesecondary is added.[secondarynamenode]
2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 5 Total time for transactions(ms): 0 Number of syncs: 3 SyncTimes(ms): 4125 [namenode]
2008-11-14 10:13:57,187 INFO org.apache.hadoop.fs.FSNamesystem: Roll Edit Log from 192.168.1.34[namenode]
2008-11-14 10:13:57,953 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file fsimage size 80 bytes.[secondarynamenode]
2008-11-14 10:13:57,968 INFO org.apache.hadoop.dfs.NameNode.Secondary: Downloaded file edits size 288 bytes.[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: fsOwner=Zhaoyb,None,root,Administrators,Users,Debugger,Users[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: isPermissionEnabled=true[secondarynamenode]
2008-11-14 10:13:58,593 INFO org.apache.hadoop.fs.FSNamesystem: supergroup=supergroup[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Edits file edits of size 288 edits # 5 loaded in 0 seconds.[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files = 0[secondarynamenode]
2008-11-14 10:13:58,640 INFO org.apache.hadoop.dfs.Storage: Number of files under construction = 0[secondarynamenode]
2008-11-14 10:13:58,718 INFO org.apache.hadoop.dfs.Storage: Image file of size 367 saved in 0 seconds.[secondarynamenode]
2008-11-14 10:13:58,796 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [secondarynamenode]
2008-11-14 10:13:58,921 INFO org.apache.hadoop.dfs.NameNode.Secondary: Posted URL 0.0.0.0:50070putimage=1&port=50090&machine=192.168.1.34&token=-16:145044639:0:1226628551796:1226628513000[secondarynamenode]
2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0 Number of syncs: 0 SyncTimes(ms): 0 [namenode]
2008-11-14 10:13:59,078 INFO org.apache.hadoop.fs.FSNamesystem: Roll FSImage from 192.168.1.34[namenode]
2008-11-14 10:13:59,265 WARN org.apache.hadoop.dfs.NameNode.Secondary: Checkpoint done. New Image Size: 367[secondarynamenode]
2008-11-14 10:29:02,171 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
2008-11-14 10:29:04,187 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
2008-11-14 10:29:06,109 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
2008-11-14 10:29:08,015 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
2008-11-14 10:29:10,031 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
2008-11-14 10:29:11,937 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
2008-11-14 10:29:13,843 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
2008-11-14 10:29:15,765 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
2008-11-14 10:29:17,671 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
2008-11-14 10:29:19,593 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
2008-11-14 10:29:21,078 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
2008-11-14 10:29:21,171 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
2008-11-14 10:34:23,156 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 0 time(s).[secondarynamenode]
2008-11-14 10:34:25,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 1 time(s).[secondarynamenode]
2008-11-14 10:34:27,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 2 time(s).[secondarynamenode]
2008-11-14 10:34:29,078 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 3 time(s).[secondarynamenode]
2008-11-14 10:34:31,000 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 4 time(s).[secondarynamenode]
2008-11-14 10:34:32,906 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 5 time(s).[secondarynamenode]
2008-11-14 10:34:34,921 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 6 time(s).[secondarynamenode]
2008-11-14 10:34:36,828 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 7 time(s).[secondarynamenode]
2008-11-14 10:34:38,640 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 8 time(s).[secondarynamenode]
2008-11-14 10:34:40,546 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: Bacoo/192.168.1.34:9000. Already tried 9 time(s).[secondarynamenode]
2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: Exception in doCheckpoint: [secondarynamenode]
2008-11-14 10:34:41,468 ERROR org.apache.hadoop.dfs.NameNode.Secondary: java.io.IOException: Call failed on local exception[secondarynamenode]
2008-11-14 10:38:43,359 INFO org.apache.hadoop.dfs.NameNode.Secondary: SHUTDOWN_MSG: [secondarynamenode]

I believe this puts the generated logs in proper chronological order, with every line tagged with the type of node it came from.
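One limitation of filtering with `grep '^2008'` is that continuation lines, such as Java stack traces, don't start with a timestamp and get dropped. Below is a minimal sketch of an awk variant that folds such lines into the preceding timestamped entry before tagging it. The input lines and the "namenode" tag are made-up examples, not taken from my logs:

```shell
# Fold continuation lines (e.g. stack traces) into the timestamped
# entry they belong to, then tag each entry with the node type.
result=$(printf '%s\n' \
    '2008-11-14 10:08:47 INFO start' \
    '    at Foo.bar(Foo.java:1)' \
    '2008-11-14 10:08:48 INFO next' |
awk -v name="namenode" '
    /^2008/ { if (buf != "") print buf "[" name "]"; buf = $0; next }
    { sub(/^[ \t]+/, ""); buf = buf " | " $0 }   # fold continuation line
    END { if (buf != "") print buf "[" name "]" }
')
printf '%s\n' "$result"
```

Because each folded entry is a single line, the final `sort` still orders everything by timestamp.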

posted on 2008-11-15 01:23 by so true, 1488 reads, 0 comments. Category: Linux
