Merge pull request #66 from astrolabsoftware/packageName
Package name
JulienPeloton authored Jul 5, 2018
2 parents 288bf87 + 3c37803 commit ec043b8
Showing 55 changed files with 188 additions and 175 deletions.
9 changes: 9 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,12 @@
## 0.1.3

- Add KNN routines ([#59](https://github.com/astrolabsoftware/spark3D/pull/59), [#60](https://github.com/astrolabsoftware/spark3D/pull/60), [#62](https://github.com/astrolabsoftware/spark3D/pull/62))
- Unify API to load data ([Point3DRDD](https://github.com/astrolabsoftware/spark3D/pull/63), [SphereRDD](https://github.com/astrolabsoftware/spark3D/pull/64))
- Speed-up cross-match methods by using native Scala methods ([Scala](https://github.com/astrolabsoftware/spark3D/pull/58))
- Add a new [website](https://astrolabsoftware.github.io/); spark3D now belongs to AstroLab Software
- Update tutorials ([tuto](https://astrolabsoftware.github.io/spark3D/))
- A few fixes here and there

## 0.1.1

- Add scripts to generate test data ([PR](https://github.com/astrolabsoftware/spark3D/pull/34))
5 changes: 5 additions & 0 deletions README.md
@@ -9,6 +9,7 @@
- [05/2018] **GSoC 2018**: spark3D has been selected for the Google Summer of Code (GSoC) 2018. Congratulations to [@mayurdb](https://github.com/mayurdb) who will work on the project this year!
- [06/2018] **Release**: version 0.1.0, 0.1.1
- [07/2018] **New location**: spark3D is an official project of [AstroLab Software](https://astrolabsoftware.github.io/)!
- [07/2018] **Release**: version 0.1.3

## Installation and tutorials

@@ -21,3 +22,7 @@ See our amazing [website](https://astrolabsoftware.github.io/spark3D/)!
* Mayur Bhosale (mayurdb31 at gmail.com) -- GSoC 2018.

Contributing to spark3D: see [CONTRIBUTING](https://github.com/astrolabsoftware/spark3D/blob/master/CONTRIBUTING.md).

## Support

<p align="center"><img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/lal_logo.jpg"/> <img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/psud.png"/> <img width="100" src="https://github.com/astrolabsoftware/spark-fits/raw/master/pic/1012px-Centre_national_de_la_recherche_scientifique.svg.png"/></p>
7 changes: 3 additions & 4 deletions build.sbt
@@ -19,8 +19,7 @@ import xerial.sbt.Sonatype._
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
version := "0.1.2"
// mainClass in Compile := Some("com.sparkfits.examples.OnionSpace")
version := "0.1.3"
)),
// Name of the application
name := "spark3D",
@@ -36,7 +35,7 @@ lazy val root = (project in file(".")).
// Do not publish artifact in test
publishArtifact in Test := false,
// Exclude runner class for the coverage
coverageExcludedPackages := "<empty>;com.spark3d.examples*",
coverageExcludedPackages := "<empty>;com.astrolabsoftware.spark3d.examples*",
// Excluding Scala library JARs that are included in the binary Scala distribution
// assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false),
// Shading to avoid conflicts with pre-installed nom.tam.fits library
@@ -47,7 +46,7 @@ lazy val root = (project in file(".")).
"org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.1.0" % "provided",
// For loading FITS files
"com.github.JulienPeloton" %% "spark-fits" % "0.4.0",
"com.github.astrolabsoftware" %% "spark-fits" % "0.4.0",
// "org.datasyslab" % "geospark" % "1.1.3",
// Uncomment if you want to trigger visualisation
// "com.github.haifengl" % "smile-plot" % "1.5.1",
2 changes: 1 addition & 1 deletion docs/01_installation.md
@@ -77,7 +77,7 @@ toto:~$ spark-shell --jars $JARS --packages $PACKAGES
You will be able to import anything from spark3D:

```scala
scala> import com.spark3d.geometryObjects.Point3D
scala> import com.astrolabsoftware.spark3d.geometryObjects.Point3D
scala> // etc...
```
Note that if you build a fat jar (that is, with `sbt assembly` rather than `sbt package`), you do not need to specify external dependencies, as they are already included in the resulting jar:
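A minimal launch sketch of that case (the jar path and version below are illustrative assumptions; adjust them to your actual `sbt assembly` output):

```shell
# After `sbt assembly`, the fat jar bundles spark3D and its dependencies,
# so a single --jars entry is enough (no --packages needed).
# The path and version are hypothetical -- check target/scala-2.11/ for yours.
spark-shell --jars target/scala-2.11/spark3D-assembly-0.1.3.jar
```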
12 changes: 6 additions & 6 deletions docs/02_introduction.md
@@ -14,7 +14,7 @@ spark3D supports various 3D shapes: points (`Point3D`), spherical shells (`Shell
### Point3D

```scala
import com.spark3d.geometryObjects.Point3D
import com.astrolabsoftware.spark3d.geometryObjects.Point3D

// Cartesian coordinates
val points = new Point3D(x: Double, y: Double, z: Double, isSpherical: Boolean = false)
@@ -26,7 +26,7 @@ val points = new Point3D(r: Double, theta: Double, phi: Double, isSpherical: Boo
### Shells and Spheres

```scala
import com.spark3d.geometryObjects.ShellEnvelope
import com.astrolabsoftware.spark3d.geometryObjects.ShellEnvelope

// Shell from 3D coordinates + inner/outer radii
val shells = new ShellEnvelope(x: Double, y: Double, z: Double, isSpherical: Boolean, innerRadius: Double, outerRadius: Double)
@@ -44,7 +44,7 @@ val spheres = new ShellEnvelope(center: Point3D, isSpherical: Boolean, radius: D
### Boxes

```scala
import com.spark3d.geometryObjects.BoxEnvelope
import com.astrolabsoftware.spark3d.geometryObjects.BoxEnvelope

// Box from region defined by three (cartesian) coordinates.
val boxes = new BoxEnvelope(p1: Point3D, p2: Point3D, p3: Point3D)
@@ -68,7 +68,7 @@ In this tutorial we will review the steps to simply create RDD from 3D data sets
A point is an object with 3 spatial coordinates. In spark3D, you can choose the coordinate system between cartesian `(x, y, z)` and spherical `(r, theta, phi)`. Let's suppose we have a text file (CSV, JSON, or TXT) whose columns are labeled `x`, `y` and `z`, the cartesian coordinates of points:

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD

// We assume filename contains at least 3 columns whose names are `colnames`
// Order of columns in the file does not matter, as they will be re-arranged
@@ -79,7 +79,7 @@ val pointRDD = new Point3DRDD(spark: SparkSession, filename: String, colnames: S
With FITS data stored in HDU #1, you would just do:

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD

// We assume hdu#1 of filename contains at least 3 columns whose names are `colnames`
// Order of columns in the file does not matter, as they will be re-arranged
@@ -96,7 +96,7 @@ A sphere is defined by its center (3 spatial coordinates) plus a radius.
In spark3D, you can choose the coordinate system of the center between cartesian `(x, y, z)` and spherical `(r, theta, phi)`. Let's suppose we have a text file (CSV, JSON, or TXT) whose columns are labeled `r`, `theta`, `phi`, the spherical coordinates and `radius`:

```scala
import com.spark3d.spatial3DRDD.SphereRDD
import com.astrolabsoftware.spark3d.spatial3DRDD.SphereRDD

// We assume filename contains at least 4 columns whose names are `colnames`.
// Order of columns in the file does not matter, as they will be re-arranged
8 changes: 4 additions & 4 deletions docs/03_partitioning.md
@@ -25,8 +25,8 @@ There are currently two partitioning strategies implemented in the library:
In the following example, we load `Point3D` data, and we re-partition it with the onion partitioning

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.spark3d.utils.GridType
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.utils.GridType

import org.apache.spark.sql.SparkSession

@@ -56,8 +56,8 @@ val pointRDD_partitioned = pointRDD.spatialPartitioning(GridType.LINEARONIONGRID
In the following example, we load `Point3D` data, and we re-partition it with the octree partitioning

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.spark3d.utils.GridType
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.utils.GridType

import org.apache.spark.sql.SparkSession

16 changes: 8 additions & 8 deletions docs/04_query.md
@@ -14,9 +14,9 @@ The spark3D library contains a number of methods and tools to manipulate 3D RDD.
An envelope query takes as input an `RDD[Shape3D]` and an envelope, and returns all objects in the RDD intersecting the envelope (both contained in and crossing it):

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.spark3d.geometryObjects.{Point3D, ShellEnvelope}
import com.spark3d.spatialOperator.RangeQuery
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.geometryObjects.{Point3D, ShellEnvelope}
import com.astrolabsoftware.spark3d.spatialOperator.RangeQuery

import org.apache.spark.sql.SparkSession

@@ -53,7 +53,7 @@ Envelope = Sphere |Envelope = Box
A cross-match takes as input two data sets and returns the objects that match, based either on center distance or on the pixel index of objects. Note that performing a cross-match between a data set of N elements and another of M elements is a priori an N×M operation, so it can be very costly! Let's load two `Point3D` data sets:

```scala
import com.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD

import org.apache.spark.sql.SparkSession

@@ -77,8 +77,8 @@ By default, the two sets are partitioned randomly (in the sense points spatially
In order to decrease the cost of performing the cross-match, you need to partition the two data sets the same way. By doing so, you will cross-match only points belonging to the same partition. For a large number of partitions, you will significantly decrease the cost:

```scala
import com.spark3d.utils.GridType
import com.spark3d.spatialPartitioning.SpatialPartitioner
import com.astrolabsoftware.spark3d.utils.GridType
import com.astrolabsoftware.spark3d.spatialPartitioning.SpatialPartitioner

// nPart is the desired number of partitions. Default is the partition number of setA_raw.
// For the spatial partitioning, you can currently choose between LINEARONIONGRID, or OCTREE.
@@ -114,7 +114,7 @@ Currently, we implemented two methods to perform a cross-match:
Here is an example which returns only elements from B with a counterpart in A, using center distance:

```scala
import com.spark3d.spatialOperator.CenterCrossMatch
import com.astrolabsoftware.spark3d.spatialOperator.CenterCrossMatch

// Distance threshold for the match
val epsilon = 0.004
@@ -127,7 +127,7 @@ val xMatchCenter = CenterCrossMatch
and the same using the Healpix indices:

```scala
import com.spark3d.spatialOperator.PixelCrossMatch
import com.astrolabsoftware.spark3d.spatialOperator.PixelCrossMatch

// Shell resolution for Healpix indexing
val nside = 512
2 changes: 1 addition & 1 deletion docs/_pages/home.md
@@ -7,7 +7,7 @@ header:
cta_label: "<i class='fas fa-download'></i> Install Now"
cta_url: "/docs/installation/"
caption:
excerpt: 'Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...<br /> <small><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.1">Latest release v0.1.1</a></small><br /><br /> {::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
excerpt: 'Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, ...<br /> <small><a href="https://github.com/astrolabsoftware/spark3D/releases/tag/0.1.3">Latest release v0.1.3</a></small><br /><br /> {::nomarkdown}<iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=star&count=true&size=large" frameborder="0" scrolling="0" width="160px" height="30px"></iframe> <iframe style="display: inline-block;" src="https://ghbtns.com/github-btn.html?user=astrolabsoftware&repo=spark3D&type=fork&count=true&size=large" frameborder="0" scrolling="0" width="158px" height="30px"></iframe>{:/nomarkdown}'
feature_row:
- image_path:
alt:
14 changes: 7 additions & 7 deletions examples/jupyter/CrossMatch.ipynb
@@ -154,7 +154,7 @@
},
"outputs": [],
"source": [
"import com.spark3d.spatial3DRDD._\n",
"import com.astrolabsoftware.spark3d.spatial3DRDD._\n",
"import org.apache.spark.sql.SparkSession\n",
"val spark = SparkSession.builder().appName(\"Xmatch\").getOrCreate()\n",
"\n",
@@ -192,8 +192,8 @@
},
"outputs": [],
"source": [
"import com.spark3d.utils.GridType\n",
"import com.spark3d.spatialPartitioning.SpatialPartitioner\n",
"import com.astrolabsoftware.spark3d.utils.GridType\n",
"import com.astrolabsoftware.spark3d.spatialPartitioning.SpatialPartitioner\n",
"\n",
"// As we are in local mode, and the file is very small, the RDD pointRDD has only 1 partition.\n",
"// For the sake of this example, let's increase the number of partition to 100.\n",
@@ -244,7 +244,7 @@
}
],
"source": [
"import com.spark3d.spatialOperator.PixelCrossMatch\n",
"import com.astrolabsoftware.spark3d.spatialOperator.PixelCrossMatch\n",
"\n",
"// Shell resolution\n",
"val nside = 512\n",
@@ -298,7 +298,7 @@
}
],
"source": [
"import com.spark3d.spatialOperator.CenterCrossMatch\n",
"import com.astrolabsoftware.spark3d.spatialOperator.CenterCrossMatch\n",
"\n",
"// Distance threshold for the match\n",
"val epsilon = 0.004\n",
@@ -336,9 +336,9 @@
"import javax.swing.JFrame\n",
"import javax.swing.JPanel\n",
"\n",
"import com.spark3d.utils.Utils.sphericalToCartesian\n",
"import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian\n",
"import org.apache.spark.rdd.RDD\n",
"import com.spark3d.geometryObjects._\n",
"import com.astrolabsoftware.spark3d.geometryObjects._\n",
"\n",
"\n",
"/** Define palette of colors */\n",
8 changes: 4 additions & 4 deletions examples/jupyter/onion_partitioning.ipynb
@@ -101,7 +101,7 @@
"metadata": {},
"outputs": [],
"source": [
"import com.spark3d.spatial3DRDD._\n",
"import com.astrolabsoftware.spark3d.spatial3DRDD._\n",
"import org.apache.spark.sql.SparkSession\n",
"val spark = SparkSession.builder().appName(\"OnionSpace\").getOrCreate()\n",
"\n",
@@ -137,7 +137,7 @@
},
"outputs": [],
"source": [
"import com.spark3d.utils.GridType\n",
"import com.astrolabsoftware.spark3d.utils.GridType\n",
"\n",
"// As we are in local mode, and the file is very small, the RDD pointRDD has only 1 partition.\n",
"// For the sake of this example, let's increase the number of partition to 5.\n",
@@ -201,9 +201,9 @@
"import javax.swing.JFrame\n",
"import javax.swing.JPanel\n",
"\n",
"import com.spark3d.utils.Utils.sphericalToCartesian\n",
"import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian\n",
"import org.apache.spark.rdd.RDD\n",
"import com.spark3d.geometryObjects._\n",
"import com.astrolabsoftware.spark3d.geometryObjects._\n",
"\n",
"/** \n",
" * Define palette of colors \n",
4 changes: 2 additions & 2 deletions run_scala.sh
@@ -18,7 +18,7 @@ SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
VERSION=0.1.1
VERSION=0.1.3

# Package it
sbt ++${SBT_VERSION} package
@@ -31,7 +31,7 @@ display="show"

## Dependencies
jars="lib/jhealpix.jar,lib/swingx-0.9.1.jar"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.3.0,com.github.haifengl:smile-core:1.5.1,com.github.haifengl:smile-plot:1.5.1,com.github.haifengl:smile-math:1.5.1,com.github.haifengl:smile-scala_2.11:1.5.1"
packages="com.github.astrolabsoftware:spark-fits_2.11:0.4.0,com.github.haifengl:smile-core:1.5.1,com.github.haifengl:smile-plot:1.5.1,com.github.haifengl:smile-math:1.5.1,com.github.haifengl:smile-scala_2.11:1.5.1"

# Run it!
spark-submit \
2 changes: 1 addition & 1 deletion run_xmatch_cluster.sh
@@ -18,7 +18,7 @@ SBT_VERSION=2.11.8
SBT_VERSION_SPARK=2.11

## Package version
VERSION=0.1.1
VERSION=0.1.3

# Package it
sbt ++${SBT_VERSION} package
12 changes: 6 additions & 6 deletions src/main/scala/com/spark3d/examples/CrossMatch.scala
@@ -13,14 +13,14 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.spark3d.examples
package com.astrolabsoftware.spark3d.examples

// spark3d lib
import com.spark3d.utils.GridType
import com.spark3d.spatial3DRDD.Point3DRDD
import com.spark3d.spatialPartitioning.SpatialPartitioner
import com.spark3d.spatialOperator.PixelCrossMatch
import com.spark3d.serialization.Spark3dConf.spark3dConf
import com.astrolabsoftware.spark3d.utils.GridType
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.spatialPartitioning.SpatialPartitioner
import com.astrolabsoftware.spark3d.spatialOperator.PixelCrossMatch
import com.astrolabsoftware.spark3d.serialization.Spark3dConf.spark3dConf

// Spark lib
import org.apache.spark.sql.SparkSession
8 changes: 4 additions & 4 deletions src/main/scala/com/spark3d/examples/OnionSpace.scala
@@ -13,12 +13,12 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.spark3d.examples
package com.astrolabsoftware.spark3d.examples

// spark3d lib
import com.spark3d.utils.GridType
import com.spark3d.utils.Utils.sphericalToCartesian
import com.spark3d.spatial3DRDD.Point3DRDD
import com.astrolabsoftware.spark3d.utils.GridType
import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian
import com.astrolabsoftware.spark3d.spatial3DRDD.Point3DRDD

// Spark lib
import org.apache.spark.sql.SparkSession
6 changes: 3 additions & 3 deletions src/main/scala/com/spark3d/geometryObjects/BoxEnvelope.scala
@@ -13,10 +13,10 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.spark3d.geometryObjects
package com.astrolabsoftware.spark3d.geometryObjects

import com.spark3d.geometryObjects.Shape3D._
import com.spark3d.utils.Utils.sphericalToCartesian
import com.astrolabsoftware.spark3d.geometryObjects.Shape3D._
import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian

import scala.math._

6 changes: 3 additions & 3 deletions src/main/scala/com/spark3d/geometryObjects/Point3D.scala
@@ -13,10 +13,10 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.spark3d.geometryObjects
package com.astrolabsoftware.spark3d.geometryObjects

import com.spark3d.geometryObjects.Shape3D._
import com.spark3d.utils.Utils.sphericalToCartesian
import com.astrolabsoftware.spark3d.geometryObjects.Shape3D._
import com.astrolabsoftware.spark3d.utils.Utils.sphericalToCartesian

/**
* Class for describing a point in 3D space.
6 changes: 3 additions & 3 deletions src/main/scala/com/spark3d/geometryObjects/Shape3D.scala
@@ -13,10 +13,10 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.spark3d.geometryObjects
package com.astrolabsoftware.spark3d.geometryObjects

import com.spark3d.utils.Utils._
import com.spark3d.utils.ExtPointing
import com.astrolabsoftware.spark3d.utils.Utils._
import com.astrolabsoftware.spark3d.utils.ExtPointing

import healpix.essentials.HealpixBase
import healpix.essentials.Pointing