sbt is powerful and flexible, but the price of that flexibility is complexity. The main differences from Maven:
1. sbt downloads packages to ~/.ivy2/ by default, while Maven stores them under ~/.m2. sbt can also use packages from the Maven repository.
2. sbt compiles incrementally; Maven does not.
3. Running sbt with no arguments starts an interactive shell, which is quite convenient.
4. When sbt pulls in multiple versions of the same package, packaging becomes troublesome: the conflicts must be resolved manually. Maven does not have this problem, since it loads the package matching the configured version.
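One common way to resolve such a conflict in sbt is to pin the contested library to a single version. A minimal sketch, assuming sbt 1.x; the library and version here are purely illustrative:

```scala
// build.sbt -- sketch; jackson-databind and its version are illustrative examples
// Force every transitive request for this artifact to resolve to one version,
// instead of letting two dependencies pull in two different versions.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7"
```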
5. sbt commands roughly correspond to Maven commands, with some differences in naming. For example, mvn install corresponds to sbt publishLocal.
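For reference, a rough mapping of everyday commands (approximate equivalents, not exact one-to-one matches):

```
# Maven              # sbt
mvn compile          sbt compile
mvn test             sbt test
mvn package          sbt package
mvn install          sbt publishLocal
mvn clean            sbt clean
```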
6. The way dependencies are configured differs; this is the focus of this article.
6.1 Configuring the version and organization
//sbt
organization := "com.demo"
name := "myscala"
version := "0.1"

//maven
<groupId>com.demo</groupId>
<artifactId>myscala</artifactId>
<version>0.1</version>
6.2 Adding repositories (resolvers)
//sbt
resolvers += "Scala-Tools Maven2 Repository" at "http://scala-tools.org/repo-releases"
//local Maven repository
resolvers += "Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository"

//maven
<repositories>
  <repository>
    <id>scala-tools.org</id>
    <name>Scala-Tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
  </repository>
</repositories>
6.3 Configuring source directories
//sbt
scalaSource in Compile := baseDirectory.value / "src/main/scala"
unmanagedResourceDirectories in Compile += baseDirectory.value / "src/main/resources"
unmanagedResourceDirectories in Compile += baseDirectory.value / "src/main/conf"

//maven
<sourceDirectory>src/main/scala</sourceDirectory>
<resources>
  <resource>
    <directory>src/main/resources</directory>
  </resource>
  <resource>
    <directory>src/main/conf</directory>
  </resource>
</resources>
6.4 Loading a dependency while excluding unwanted transitive packages
//sbt
//method 1: exclude takes both the organization and the artifact name
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0" exclude("javax.servlet", "servlet-api")
//method 2: an ExclusionRule can exclude an entire organization
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0" excludeAll(
  ExclusionRule(organization = "javax.servlet")
)

//maven
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.0</version>
  <exclusions>
    <exclusion>
      <groupId>javax.servlet</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>
6.5 Publishing
//sbt
isSnapshot := true
publishTo := {
  val nexus = "http://artifactory.xxxxxx.com/artifactory/"
  if (isSnapshot.value)
    Some("snapshots" at nexus + "snapshots")
  else
    Some("releases" at nexus + "releases")
}
credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")

//maven
<distributionManagement>
  <repository>
    <id>artifactory</id>
    <name>xxxxxx</name>
    <url>http://artifactory.xxxxxx.com/artifactory/xxxxxx/</url>
  </repository>
  <snapshotRepository>
    <id>artifactory</id>
    <name>xxxxxx</name>
    <url>http://artifactory.xxxxxxx.com/artifactory/xxxxxxx/</url>
  </snapshotRepository>
</distributionManagement>
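The credentials file referenced above is a plain properties file. A sketch of ~/.ivy2/.credentials, where the realm, user, and password values are placeholders to be replaced with your repository's actual settings:

```
realm=Artifactory Realm
host=artifactory.xxxxxx.com
user=deploy-user
password=deploy-password
```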
6.6 Adding dependencies
//sbt
//single dependency
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.0" % "test"

//multiple dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.3.0",
  "com.alibaba" % "fastjson" % "1.2.49"
)

//maven
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.0</version>
  <scope>test</scope>
</dependency>
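Putting the pieces above together, a minimal complete build.sbt might look like the following. This is a sketch; the scalaVersion value is an assumption chosen to match the _2.11 artifact suffix used throughout this article:

```scala
// build.sbt -- minimal sketch combining the settings discussed above
organization := "com.demo"
name         := "myscala"
version      := "0.1"
scalaVersion := "2.11.12"   // assumed; must match the _2.11 artifact suffix

// resolve packages from the local Maven repository as well
resolvers += "Local Maven Repository" at
  "file://" + Path.userHome.absolutePath + "/.m2/repository"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.0",
  "org.apache.spark" % "spark-sql_2.11"  % "2.3.0"
)
```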
Please retain attribution when reposting.
Author: 海底苍鹰
URL: http://blog.51yip.com/hadoop/2167.html