
Compiler fails to start

Published: 2024-01-19 16:33:32

1. When compiling a C file with VS, the program cannot start and the specified file cannot be found. What should I do?

1. Open the project in VS and click the menu: Project - Properties.
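The answer stops at opening the properties dialog. A common cause of this error (my assumption, not part of the original answer) is that the path VS launches when debugging no longer matches the file the build actually produces. The relevant fields in the Project Properties dialog typically look like this:

```
Configuration Properties → General
    Output Directory : $(SolutionDir)$(Configuration)\
    Target Name      : $(ProjectName)

Configuration Properties → Debugging
    Command          : $(TargetPath)
```

If Command points at a stale or hand-edited path, resetting it to $(TargetPath) and rebuilding usually clears the "cannot find the specified file" error; also check that the active configuration (Debug/Release) actually built without errors.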

2. How to fix spark shell failing to start because of the scala compiler

spark shell fails to start because of a scala compiler problem

After installing spark with sbt, the examples run fine, but attempting to run spark-shell produces an error:

d:\scala\spark\bin\spark-shell.cmd
slf4j: class path contains multiple slf4j bindings.
slf4j: found binding in [jar:file:/d:/scala/spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar!/org/slf4j/impl/staticloggerbinder.class]
slf4j: found binding in
[jar:file:/d:/scala/spark/tools/target/scala-2.10/spark-tools-assembly-0.9.0-incubating.jar!/org/slf4j/impl/staticloggerbinder.class]
slf4j: see http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
slf4j: actual binding is of type [org.slf4j.impl.log4jloggerfactory]
14/04/03 20:40:43 info httpserver: using spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/04/03 20:40:43 info httpserver: starting http server

failed to initialize compiler: object scala.runtime in compiler mirror not found.
** note that as of 2.8 scala does not assume use of the java classpath.
** for the old behavior pass -usejavacp to scala, or if using a settings
** object programatically, settings.usejavacp.value = true.
14/04/03 20:40:44 warn sparkiloop$sparkiloopinterpreter: warning: compiler accessed before init set up. assuming no postinit code.

failed to initialize compiler: object scala.runtime in compiler mirror not found.
** note that as of 2.8 scala does not assume use of the java classpath.
** for the old behavior pass -usejavacp to scala, or if using a settings
** object programatically, settings.usejavacp.value = true.
failed to initialize compiler: object scala.runtime in compiler mirror not found.
        at scala.predef$.assert(predef.scala:179)
        at org.apache.spark.repl.sparkimain.initializesynchronous(sparkimain.scala:197)
        at org.apache.spark.repl.sparkiloop$$anonfun$process$1.apply$mcz$sp(sparkiloop.scala:919)
        at org.apache.spark.repl.sparkiloop$$anonfun$process$1.apply(sparkiloop.scala:876)
        at org.apache.spark.repl.sparkiloop$$anonfun$process$1.apply(sparkiloop.scala:876)
        at scala.tools.nsc.util.scalaclassloader$.savingcontextloader(scalaclassloader.scala:135)
        at org.apache.spark.repl.sparkiloop.process(sparkiloop.scala:876)
        at org.apache.spark.repl.sparkiloop.process(sparkiloop.scala:968)
        at org.apache.spark.repl.main$.main(main.scala:31)
        at org.apache.spark.repl.main.main(main.scala)

Googling turned up no solution. I only noticed a question in the Q&A on the sbt site: http://www.scala-sbt.org/release/docs/faq#how-do-i-use-the-scala-interpreter-in-my-code. It describes how to change the setting in code, which obviously doesn't fit my case.

I kept digging. Note that this error message only appeared after 2.8; the reason is that a proposal about the compiler/interpreter classpath was accepted: default compiler/interpreter classpath in a managed environment.

Searching further on google, one article caught my attention: object scala found. In it I finally found a workaround:



however, a working command can be recovered, like so:
$ jrunscript -Djava.class.path=scala-library.jar -Dscala.usejavacp=true -classpath scala-compiler.jar -l scala



So I modified \bin\spark-class2.cmd:

rem set JAVA_OPTS to be able to load native libraries and to set heap size
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Dscala.usejavacp=true -Xms%SPARK_MEM% -Xmx%SPARK_MEM%
rem attention: when changing the way the JAVA_OPTS are assembled, the change must be reflected in ExecutorRunner.scala!

The newly added parameter is -Dscala.usejavacp=true (shown in red in the original post). Run \bin\spark-shell.cmd again:
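If you would rather not edit the script, my reading of spark-class2.cmd in this version of Spark is that it folds the SPARK_JAVA_OPTS environment variable into OUR_JAVA_OPTS, so the same flag can probably be set from the console before launching (verify against your own copy of the script before relying on this):

```
rem Hypothetical alternative, assuming spark-class2.cmd reads SPARK_JAVA_OPTS:
set SPARK_JAVA_OPTS=-Dscala.usejavacp=true
d:\scala\spark\bin\spark-shell.cmd
```

This keeps the change out of the launch scripts, so it survives reinstalling or rebuilding spark.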

d:>d:\scala\spark\bin\spark-shell.cmd
slf4j: class path contains multiple slf4j bindings.
slf4j: found binding in [jar:file:/d:/scala/spark/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar!/org/slf4j/impl/staticloggerbinder.class]
slf4j: found binding in
[jar:file:/d:/scala/spark/tools/target/scala-2.10/spark-tools-assembly-0.9.0-incubating.jar!/org/slf4j/impl/staticloggerbinder.class]
slf4j: see http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
slf4j: actual binding is of type [org.slf4j.impl.log4jloggerfactory]
14/04/03 22:18:41 info httpserver: using spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/04/03 22:18:41 info httpserver: starting http server
welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.9.0
      /_/

using scala version 2.10.3 (java hotspot(tm) client vm, java 1.6.0_10)
type in expressions to have them evaluated.
type :help for more information.
14/04/03 22:19:12 info slf4jlogger: slf4jlogger started
14/04/03 22:19:13 info remoting: starting remoting
14/04/03 22:19:16 info remoting: remoting started; listening on addresses :[akka.tcp://spark@choco-pc:5960]
14/04/03 22:19:16 info remoting: remoting now listens on addresses: [akka.tcp://spark@choco-pc:5960]
14/04/03 22:19:16 info sparkenv: registering blockmanagermaster
14/04/03 22:19:17 info diskblockmanager: created local directory at c:\users\choco\appdata\local\temp\spark-local-20140403221917-7172
14/04/03 22:19:17 info memorystore: memorystore started with capacity 304.8 mb.
14/04/03 22:19:18 info connectionmanager: bound socket to port 5963 with id = connectionmanagerid(choco-pc,5963)
14/04/03 22:19:18 info blockmanagermaster: trying to register blockmanager
14/04/03 22:19:18 info blockmanagermasteractor$blockmanagerinfo: registering block manager choco-pc:5963 with 304.8 mb ram
14/04/03 22:19:18 info blockmanagermaster: registered blockmanager
14/04/03 22:19:18 info httpserver: starting http server
14/04/03 22:19:18 info httpbroadcast: broadcast server started at http://192.168.1.100:5964
14/04/03 22:19:18 info sparkenv: registering mapoutputtracker
14/04/03 22:19:18 info httpfileserver: http file server directory is c:\users\choco\appdata\local\temp\spark-e122cfe9-2d62-4a47-920c-96b54e4658f6
14/04/03 22:19:18 info httpserver: starting http server
14/04/03 22:19:22 info sparkui: started spark web ui at http://choco-pc:4040
14/04/03 22:19:22 info executor: using repl class uri: http://192.168.1.100:5947
created spark context..
spark context available as sc.

scala> :quit
stopping spark context.
14/04/03 23:05:21 info mapoutputtrackermasteractor: mapoutputtrackeractor stopped!
14/04/03 23:05:21 info connectionmanager: selector thread was interrupted!
14/04/03 23:05:21 info connectionmanager: connectionmanager stopped
14/04/03 23:05:21 info memorystore: memorystore cleared
14/04/03 23:05:21 info blockmanager: blockmanager stopped
14/04/03 23:05:21 info blockmanagermasteractor: stopping blockmanagermaster
14/04/03 23:05:21 info blockmanagermaster: blockmanagermaster stopped
14/04/03 23:05:21 info sparkcontext: successfully stopped sparkcontext
14/04/03 23:05:21 info remoteactorrefprovider$remotingterminator: shutting down remote daemon.
14/04/03 23:05:21 info remoteactorrefprovider$remotingterminator: remote daemon shut down; proceeding with flushing remote transports.

Good. Open http://choco-pc:4040 in a browser and you can see spark's status, environment, executors, and other information.

This fix may only apply to my situation. If you still have problems, look for further related material.

Along the way I also hit a file-not-found error. It turned out JAVA_HOME was set incorrectly. If you run into problems, turn on echo in the script and look for the cause there.
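Turning on echo means changing the @echo off line that batch scripts conventionally start with. A minimal sketch, assuming the standard layout of spark-class2.cmd (check your copy; the variable names below are what I'd expect, not taken from the original article):

```
rem At the top of spark-class2.cmd, replace
rem     @echo off
rem with:
@echo on
rem Every command is then printed before it runs. You can also dump
rem the variables you suspect, for example:
echo JAVA_HOME is "%JAVA_HOME%"
```

With echo on, a bad JAVA_HOME shows up immediately as a java invocation pointing at a path that does not exist.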
