Merge pull request #254 from alyiwang/master

Upgrade to play 2.4.8
This commit is contained in:
Yi (Alan) Wang 2016-10-20 13:18:58 -07:00 committed by GitHub
commit dca47a3b75
41 changed files with 905 additions and 942 deletions


@ -4,15 +4,13 @@ jdk:
- oraclejdk8
before_install:
# download play 2.2.4
- wget http://downloads.typesafe.com/play/2.2.4/play-2.2.4.zip
- unzip play-2.2.4.zip && rm play-2.2.4.zip && mv play-2.2.4 $HOME/
- export PLAY_HOME="$HOME/play-2.2.4"
- echo $PLAY_HOME
# change the build file stack size
- sed -i -e 's/-Xss1M/-Xss2M/g' $PLAY_HOME/framework/build
- cat $PLAY_HOME/framework/build
# download activator
- wget https://downloads.typesafe.com/typesafe-activator/1.3.11/typesafe-activator-1.3.11-minimal.zip
- unzip -q typesafe-activator-1.3.11-minimal.zip && rm typesafe-activator-1.3.11-minimal.zip && mv activator-1.3.11-minimal $HOME/
- export ACTIVATOR_HOME="$HOME/activator-1.3.11-minimal"
- echo $ACTIVATOR_HOME
# increase SBT build tool max heap size
- export SBT_OPTS="-Xms1G -Xmx1G -Xss2M"
# elasticsearch
- curl -O https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/deb/elasticsearch/2.3.3/elasticsearch-2.3.3.deb && sudo dpkg -i --force-confnew elasticsearch-2.3.3.deb && sudo service elasticsearch restart
@ -44,4 +42,3 @@ before_script:
- mysql -u root -e "GRANT ALL PRIVILEGES ON *.* TO 'travis'@'localhost'"
- cd data-model/DDL; mysql -u root -D wherehows < create_all_tables_wrapper.sql; cd ../..
- sleep 5
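The fixed `sleep 5` before the build gives MySQL/Elasticsearch time to come up, but a poll is more robust. A hypothetical sketch (the `wait_for` helper is not part of WhereHows; in `.travis.yml` the predicate would be something like `curl -sf http://localhost:9200`):

```shell
# Hypothetical replacement for a fixed sleep: retry a predicate command
# until it succeeds or the retry budget is exhausted.
wait_for() {
  tries="$1"; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@"; then
      echo "ready after $i tries"
      return 0
    fi
    i=$((i + 1))
    sleep 0.1
  done
  echo "timed out"
  return 1
}
```

Usage: `wait_for 30 curl -sf http://localhost:9200` in place of `sleep 5`.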


@ -1,16 +1,16 @@
# WhereHows [![Build Status](https://travis-ci.org/linkedin/WhereHows.svg?branch=master)](https://travis-ci.org/linkedin/WhereHows)
WhereHows is a data discovery and lineage tool built at LinkedIn. It integrates with all the major data processing systems and collects both catalog and operational metadata from them.
Within the central metadata repository, WhereHows curates, associates, and surfaces the metadata information through two interfaces:
* a web application that enables data & lineage discovery, and community collaboration
* an API endpoint that empowers automation of data processes/applications
WhereHows serves as the single platform that:
* links data objects with people and processes
* enables crowdsourcing for data knowledge
* provides data governance and provenance based on ownership and lineage
## Documentation
The detailed information can be found in the [Wiki][wiki]
@ -27,21 +27,26 @@ New to Wherehows? Check out the [Getting Started Guide][GS]
### Preparation
First, please get Play Framework in place.
First, please get Play Framework (Activator) in place.
```
wget http://downloads.typesafe.com/play/2.2.4/play-2.2.4.zip
# Download Activator
wget https://downloads.typesafe.com/typesafe-activator/1.3.11/typesafe-activator-1.3.11-minimal.zip
# Unzip, Remove zipped folder, move play folder to $HOME
unzip play-2.2.4.zip && rm play-2.2.4.zip && mv play-2.2.4 $HOME/
unzip -q typesafe-activator-1.3.11-minimal.zip && rm typesafe-activator-1.3.11-minimal.zip && mv activator-1.3.11-minimal $HOME/
# Add PLAY_HOME, GRADLE_HOME. Update Path to include new gradle, alias to counteract issues
echo 'export PLAY_HOME="$HOME/play-2.2.4"' >> ~/.bashrc
# Add ACTIVATOR_HOME, GRADLE_HOME. Update Path to include new gradle, alias to counteract issues
echo 'export ACTIVATOR_HOME="$HOME/activator-1.3.11-minimal"' >> ~/.bashrc
source ~/.bashrc
```
You need to update the file $PLAY_HOME/framework/build to increase the **JVM stack size** (-Xss1M) to 2M or more.
You need to increase the SBT build tool max heap size for building the web module
```
echo 'export SBT_OPTS="-Xms1G -Xmx1G -Xss2M"' >> ~/.bashrc
source ~/.bashrc
```
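Taken together, the Preparation steps above leave two environment variables that the rest of this guide relies on. A quick sanity check (a sketch; `check_env` is a hypothetical helper, and the values mirror this guide, so adjust paths if yours differ):

```shell
# Sketch: verify the environment from the steps above before building.
check_env() {
  for name in "$@"; do
    eval "value=\$$name"   # indirect expansion, POSIX-compatible
    if [ -n "$value" ]; then
      echo "ok: $name=$value"
    else
      echo "missing: $name"
    fi
  done
}

export ACTIVATOR_HOME="$HOME/activator-1.3.11-minimal"
export SBT_OPTS="-Xms1G -Xmx1G -Xss2M"
check_env ACTIVATOR_HOME SBT_OPTS
```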
Second, please [set up the metadata repository][DB] in MySQL.
```
CREATE DATABASE wherehows
DEFAULT CHARACTER SET utf8
@ -58,7 +63,7 @@ Execute the [DDL files][DDL] to create the required repository tables in **where
### Build
1. Get the source code: ```git clone https://github.com/linkedin/WhereHows.git```
2. Put a few 3rd-party jar files into the **metadata-etl/extralibs** directory. Some of these jar files may not be available in Maven Central or Artifactory. See [the download instructions][EXJAR] for more detail. ```cd WhereHows/metadata-etl/extralibs```
3. Go back to the **WhereHows** root directory and build all the modules: ```./gradlew build```
4. Go back to the **WhereHows** root directory and start the metadata ETL and API service: ```cd backend-service ; $PLAY_HOME/play run```
5. Go back to the **WhereHows** root directory and start the web front-end: ```cd web ; $PLAY_HOME/play run``` Then WhereHows UI is available at http://localhost:9000 by default. For example, ```play run -Dhttp.port=19001``` will use port 19001 to serve UI.
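Since this PR swaps the standalone Play distribution for Activator, the `$PLAY_HOME/play run` invocations in steps 4-5 map onto `$ACTIVATOR_HOME/bin/activator run`. A dry-run sketch (it only prints the commands; `run_module` is a hypothetical helper, not part of WhereHows):

```shell
# Hypothetical helper: print the Activator command for a Play module,
# mirroring steps 4-5 above after the Play -> Activator switch.
ACTIVATOR_HOME="${ACTIVATOR_HOME:-$HOME/activator-1.3.11-minimal}"

run_module() {
  module="$1"
  port="${2:-9000}"   # Play serves on 9000 unless -Dhttp.port overrides it
  echo "cd $module && $ACTIVATOR_HOME/bin/activator run -Dhttp.port=$port"
}

run_module backend-service 19001  # metadata ETL and API service
run_module web                    # web front-end
```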


@ -17,7 +17,7 @@ import akka.actor.ActorRef;
import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.actor.Scheduler;
import akka.routing.SmallestMailboxRouter;
import akka.routing.SmallestMailboxPool;
import scala.concurrent.ExecutionContext;
@ -34,7 +34,7 @@ public class ActorRegistry {
public static ActorRef schedulerActor = actorSystem.actorOf(Props.create(SchedulerActor.class), "SchedulerActor");
public static ActorRef etlJobActor =
actorSystem.actorOf(Props.create(EtlJobActor.class).withRouter(new SmallestMailboxRouter(10)), "EtlJobActor");
actorSystem.actorOf(new SmallestMailboxPool(10).props(Props.create(EtlJobActor.class)), "EtlJobActor");
public static ActorRef treeBuilderActor = actorSystem.actorOf(Props.create(TreeBuilderActor.class), "TreeBuilderActor");


@ -40,7 +40,7 @@ public class DatasetController extends Controller {
if (datasetName != null) {
ObjectNode result = UserDao.getWatchers(datasetName);
resultJson.put("return_code", 200);
resultJson.put("watchers", result);
resultJson.set("watchers", result);
}
return ok(resultJson);
}
@ -48,7 +48,7 @@ public class DatasetController extends Controller {
public static Result getDatasetInfo() throws SQLException {
ObjectNode resultJson = Json.newObject();
String datasetIdString = request().getQueryString("datasetId");
if(datasetIdString != null) {
if (datasetIdString != null) {
int datasetId = Integer.valueOf(datasetIdString);
try {
@ -64,8 +64,8 @@ public class DatasetController extends Controller {
}
String urn = request().getQueryString("urn");
if(urn != null) {
if(!Urn.validateUrn(urn)) {
if (urn != null) {
if (!Urn.validateUrn(urn)) {
resultJson.put("return_code", 400);
resultJson.put("error_message", "Urn format wrong!");
return ok(resultJson);
@ -129,7 +129,7 @@ public class DatasetController extends Controller {
if (propertiesLike != null) {
ObjectNode result = DatasetDao.getDatasetUrnForPropertiesLike(propertiesLike);
resultJson.put("return_code", 200);
resultJson.put("dataset_urns", result);
resultJson.set("dataset_urns", result);
}
} catch (Exception e) {
e.printStackTrace();
@ -141,7 +141,7 @@ public class DatasetController extends Controller {
}
public static Result getDatasetDependentsById(Long datasetId)
throws SQLException {
ObjectNode resultJson = Json.newObject();
if (datasetId > 0) {
try {
@ -162,7 +162,7 @@ public class DatasetController extends Controller {
}
public static Result getDatasetDependentsByUri(String datasetUri)
throws SQLException {
/* expect
* hive:///db_name.table_name
* hive:///db_name/table_name
@ -173,14 +173,14 @@ public class DatasetController extends Controller {
*/
ObjectNode resultJson = Json.newObject();
String[] uri_parts = datasetUri.split(":");
if(uri_parts.length != 2){
if (uri_parts.length != 2) {
resultJson.put("return_code", 400);
resultJson.put("error_message", "Invalid dataset URI");
return ok(resultJson);
}
String dataset_type = uri_parts[0];
String dataset_path = uri_parts[1].substring(2); // start from the 3rd slash
if (dataset_path.indexOf(".") > 0){
if (dataset_path.indexOf(".") > 0) {
dataset_path.replace(".", "/");
}
@ -202,5 +202,4 @@ public class DatasetController extends Controller {
resultJson.put("error_message", "No parameter provided");
return ok(resultJson);
}
}


@ -1,4 +1,4 @@
package shared; /**
/**
* Copyright 2015 LinkedIn Corp. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
@ -11,6 +11,8 @@ package shared; /**
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package shared;
import java.util.HashSet;
import java.util.List;
import java.util.Set;


@ -1,49 +1,29 @@
apply plugin: 'scala'
apply plugin: 'idea'
def findPlayHome(){
project.ext.playHome = System.getenv()['PLAY_HOME']
project.ext.playHome = System.getenv()['ACTIVATOR_HOME']
if (null == project.ext.playHome) {
throw new GradleException('PLAY_HOME env variable not set!')
throw new GradleException('ACTIVATOR_HOME env variable not set!')
}
project.ext.playExec = "${playHome}/play"
project.ext.playExec = "${playHome}/bin/activator"
}
findPlayHome()
repositories{
mavenCentral()
// Play framework manages its own dependencies in a local Ivy repo
jcenter()
maven {
name "typesafe-maven-release"
url "https://repo.typesafe.com/typesafe/maven-releases"
}
ivy {
name "typesafe-ivy-release"
url "https://repo.typesafe.com/typesafe/ivy-releases"
layout "ivy"
}
flatDir name: 'extralibs',
dirs: "${projectDir}/metadata-etl/extralibs"
}
configurations {
// configuration that holds jars to copy into lib
extraLibs
provided
all*.exclude group: 'org.slf4j', module: 'slf4j-log4j12'
all*.exclude group: 'log4j'
all*.resolutionStrategy {
dependencySubstitution {
substitute module('org.slf4j:slf4j-log4j12') with module('ch.qos.logback:logback-classic:1.1.7')
//prefer 'log4j-over-slf4j' over 'log4j'
}
}
all*.resolutionStrategy {
dependencySubstitution {
substitute module('org.slf4j:slf4j-log4j12') with module('ch.qos.logback:logback-classic:1.1.7')
//prefer 'log4j-over-slf4j' over 'log4j'
force 'com.typesafe:config:1.3.1', 'io.netty:netty:3.10.6.Final'
}
}
}
dependencies{
@ -53,10 +33,8 @@ dependencies{
compile project(":metadata-etl")
compile externalDependency.play
compile externalDependency.play_java_jdbc
compile externalDependency.play_ebean
compile externalDependency.play_cache
compile externalDependency.spring_context
compile externalDependency.spring_jdbc
compile externalDependency.typesafe_config
compile externalDependency.netty
compile externalDependency.mockito
compile externalDependency.slf4j_api
compile externalDependency.jasypt


@ -2,26 +2,23 @@ name := "backend-service"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.10.6"
unmanagedJars in Compile <++= baseDirectory map { base =>
val dirs = (base / "metadata-etl/extralibs") +++ (base / "extralibs")
val dirs = base / "extralibs"
(dirs ** "*.jar").classpath
}
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
"mysql" % "mysql-connector-java" % "5.1.22",
"org.springframework" % "spring-context" % "4.1.1.RELEASE",
"org.springframework" % "spring-jdbc" % "4.1.1.RELEASE",
"org.mockito" % "mockito-core" % "1.9.5",
"org.quartz-scheduler" % "quartz" % "2.2.1",
"org.quartz-scheduler" % "quartz-jobs" % "2.2.1",
"mysql" % "mysql-connector-java" % "5.1.40",
"org.mockito" % "mockito-core" % "1.10.19",
"org.slf4j" % "slf4j-api" % "1.7.21",
"org.jasypt" % "jasypt" % "1.9.2",
"org.apache.kafka" % "kafka_2.10" % "0.10.0.0",
"org.apache.kafka" % "kafka-clients" % "0.10.0.0"
"org.apache.kafka" % "kafka_2.10" % "0.10.0.1",
"org.apache.kafka" % "kafka-clients" % "0.10.0.1"
).map(_.exclude("log4j", "log4j"))
.map(_.exclude("org.slf4j", "slf4j-log4j12"))
play.Project.playJavaSettings
.map(_.exclude("org.slf4j", "slf4j-log4j12"))


@ -1,16 +1,24 @@
include "database"
# This is the main configuration file for the application.
# ~~~~~
# Secret key
# ~~~~~
# The secret key is used to secure cryptographics functions.
#
# This must be changed for production, but we recommend not changing it in this file.
#
# See http://www.playframework.com/documentation/latest/ApplicationSecret for more details.
play.crypto.secret = "changeme"
# The application languages
# ~~~~~
application.langs="en"
play.i18n.langs = [ "en" ]
# Global object class
# ~~~~~
# Define the Global object class for this application.
# Default to Global in the root package.
# application.global=Global
application.global=shared.Global
# Router
# ~~~~~
@ -18,54 +26,37 @@ application.langs="en"
# This router will be looked up first when the application is starting up,
# so make sure this is the entry point.
# Furthermore, it's assumed your route file is named properly.
# So for an application router like `conf/my.application.Router`,
# you may need to define a router file `my.application.routes`.
# Default to Routes in the root package (and `conf/routes`)
# application.router=my.application.Routes
# So for an application router like `my.application.Router`,
# you may need to define a router file `conf/my.application.routes`.
# Default to Routes in the root package (and conf/routes)
# play.http.router = my.application.Routes
# Database configuration
# ~~~~~
# You can declare as many datasources as you want.
# By convention, the default datasource is named `default`
#
# db.default.driver=org.h2.Driver
# db.default.url="jdbc:h2:mem:play"
# db.default.user=sa
# db.default.password=""
#
# connection to wherehows mysql database
db.wherehows.driver = com.mysql.jdbc.Driver
db.wherehows.url = ${WHZ_DB_URL}
db.wherehows.username = ${WHZ_DB_USERNAME}
db.wherehows.password = ${WHZ_DB_PASSWORD}
db.wherehows.host = ${WHZ_DB_HOST}
# You can expose this datasource via JNDI if needed (Useful for JPA)
# db.default.jndiName=DefaultDS
# Evolutions
# ~~~~~
# You can disable evolutions if needed
# evolutionplugin=disabled
# play.evolutions.enabled=false
# Ebean configuration
# ~~~~~
# You can declare as many Ebean servers as you want.
# By convention, the default server is named `default`
#
# ebean.default="models.*"
# Logger
# ~~~~~
# You can also configure logback (http://logback.qos.ch/),
# by providing an application-logger.xml file in the conf directory.
# Root logger:
logger.root=ERROR
# Logger used by the framework:
logger.play=INFO
# Logger provided to your application:
logger.application=DEBUG
# You can disable evolutions for a specific datasource if necessary
# play.evolutions.db.default.enabled=false
# if this variable is not set, every job will run
# if it is set, only the job ids in this list will be scheduled
# scheduler.jobid.whitelist=[1,2,3,4,5,6,7,8,9]
scheduler.check.interval=10
# start the following list of kafka consumer etl jobs
# kafka.consumer.etl.jobid=[44]
application.global=shared.Global
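For reference, the Play 2.2 → 2.4 configuration key renames that this `application.conf` migration reflects, collected in one place (a summary sketch; the old keys are the standard Play 2.2 names, and values are the placeholders used above):

```
# Play 2.2 key                 ->  Play 2.4 key
application.secret             ->  play.crypto.secret
application.langs="en"         ->  play.i18n.langs = [ "en" ]
application.router             ->  play.http.router
evolutionplugin=disabled       ->  play.evolutions.enabled=false
logger.root / logger.play /
logger.application             ->  moved to the new conf/logback.xml added in this PR
```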


@ -1,6 +0,0 @@
# connection to wherehows mysql database
db.wherehows.driver = com.mysql.jdbc.Driver
db.wherehows.url = ${WHZ_DB_URL}
db.wherehows.user = ${WHZ_DB_USERNAME}
db.wherehows.password = ${WHZ_DB_PASSWORD}
db.wherehows.host = ${WHZ_DB_HOST}


@ -0,0 +1,37 @@
<!--
Copyright 2015 LinkedIn Corp. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-->
<configuration>
<conversionRule conversionWord="coloredLevel" converterClass="play.api.Logger$ColoredLevel" />
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%coloredLevel - %logger - %message%n%xException</pattern>
</encoder>
</appender>
<!--
The logger name is typically the Java/Scala package name.
This configures the log level to log at for a package and its children packages.
-->
<logger name="play" level="INFO" />
<logger name="application" level="DEBUG" />
<root level="ERROR">
<appender-ref ref="STDOUT" />
</root>
</configuration>


@ -1 +1 @@
sbt.version=0.13.5
sbt.version=0.13.12


@ -5,4 +5,4 @@ logLevel := Level.Warn
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.4")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.8")
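The plugin bump pairs with the project-level change visible in the backend-service `build.sbt` above: in Play 2.4 the sbt plugin is enabled explicitly per project, replacing the Play 2.2 `play.Project.playJavaSettings` style this PR removes. A minimal sketch of the two files together:

```
// project/plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.8")

// build.sbt -- Play 2.4 style: enable the plugin explicitly on the project
lazy val root = (project in file(".")).enablePlugins(PlayJava)
```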


@ -1,3 +1,16 @@
/**
* Copyright 2015 LinkedIn Corp. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package actors;
import com.google.common.io.Files;
@ -10,7 +23,7 @@ import java.io.IOException;
import java.nio.charset.Charset;
import java.util.Properties;
import static org.fest.assertions.Assertions.assertThat;
import static org.junit.Assert.assertThat;
public class ConfigUtilTest {


@ -1,3 +1,17 @@
/**
* Copyright 2015 LinkedIn Corp. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package controllers;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ArrayNode;
@ -10,19 +24,17 @@ import org.junit.Test;
import play.Logger;
import play.libs.Json;
import play.mvc.Content;
import play.mvc.Http;
import play.mvc.Result;
import play.test.FakeApplication;
import java.lang.Exception;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import static play.test.Helpers.*;
import static org.fest.assertions.Assertions.*;
import static org.junit.Assert.assertThat;
import static org.mockito.Mockito.*;
@ -75,7 +87,6 @@ public class DatasetControllerTest {
{
assertThat(false);
}
}
@AfterClass


@ -6,6 +6,8 @@ apply plugin: 'idea'
apply plugin: 'license'
ext.licenseFile = file('.license_header.txt')
version = '1.0'
configurations.all {
exclude group: 'org.slf4j', module: 'slf4j-log4j12'
exclude group: 'log4j'
@ -19,8 +21,9 @@ configurations.all {
}
subprojects {
apply plugin: 'eclipse'
apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'eclipse'
apply plugin: 'com.github.hierynomus.license'
eclipse {
@ -34,7 +37,6 @@ subprojects {
}
}
license {
header licenseFile
@ -50,16 +52,26 @@ subprojects {
dirs 'extralibs'
}
mavenCentral()
jcenter()
maven {
name "typesafe-maven-release"
url "https://repo.typesafe.com/typesafe/maven-releases"
}
ivy {
name "typesafe-ivy-release"
url "https://repo.typesafe.com/typesafe/ivy-releases"
layout "ivy"
}
maven { // this is required for hive
name "conjars-maven-release"
url 'http://conjars.org/repo'
}
maven {
url 'https://repo.typesafe.com/typesafe/maven-releases/'
}
maven {
maven { // this is required by confluent kafka common
name "confluent-maven-release"
url 'http://packages.confluent.io/maven/'
}
}
task excludegroup(type: Test) {
useTestNG() {
excludeGroups 'needConfig'
@ -70,46 +82,47 @@ subprojects {
}
}
ext.externalDependency = ["mysql" : "mysql:mysql-connector-java:5.1.36",
"jython" : "org.python:jython-standalone:2.7.0",
"testng" : "org.testng:testng:6.9.6",
"hadoop_common" : "org.apache.hadoop:hadoop-common:2.7.1",
"hadoop_client" : "org.apache.hadoop:hadoop-mapreduce-client-core:2.7.1",
"hadoop_auth" : "org.apache.hadoop:hadoop-auth:2.7.1",
"pig" : "org.apache.pig:pig:0.15.0",
"hive_exec" : "org.apache.hive:hive-exec:1.2.1",
"avro" : "org.apache.avro:avro:1.7.7",
"avro_mapred" : "org.apache.avro:avro-mapred:1.7.7",
"joda" : "joda-time:joda-time:2.8.2",
"jsch" : "com.jcraft:jsch:0.1.53",
"http_client" : "org.apache.httpcomponents:httpclient:4.5",
"http_core" : "org.apache.httpcomponents:httpcore:4.4.1",
"json_path" : "com.jayway.jsonpath:json-path:2.0.0",
"akka" : "com.typesafe.akka:akka-actor_2.10:2.2.5",
"jgit" : "org.eclipse.jgit:org.eclipse.jgit:4.1.1.201511131810-r",
"jsoup" : "org.jsoup:jsoup:1.8.3",
"commons_io" : "commons-io:commons-io:2.4",
ext.externalDependency = ["mysql" : "mysql:mysql-connector-java:5.1.40",
"jython" : "org.python:jython-standalone:2.7.0",
"testng" : "org.testng:testng:6.9.10",
"hadoop_common" : "org.apache.hadoop:hadoop-common:2.7.3",
"hadoop_client" : "org.apache.hadoop:hadoop-mapreduce-client-core:2.7.3",
"hadoop_auth" : "org.apache.hadoop:hadoop-auth:2.7.3",
"pig" : "org.apache.pig:pig:0.15.0",
"hive_exec" : "org.apache.hive:hive-exec:1.2.1",
"avro" : "org.apache.avro:avro:1.7.7",
"avro_mapred" : "org.apache.avro:avro-mapred:1.7.7",
"jsch" : "com.jcraft:jsch:0.1.54",
"http_client" : "org.apache.httpcomponents:httpclient:4.5.2",
"http_core" : "org.apache.httpcomponents:httpcore:4.4.5",
"json_path" : "com.jayway.jsonpath:json-path:2.2.0",
"jgit" : "org.eclipse.jgit:org.eclipse.jgit:4.1.2.201602141800-r",
"jsoup" : "org.jsoup:jsoup:1.8.3",
"commons_io" : "commons-io:commons-io:2.5",
"jackson_databind" : "com.fasterxml.jackson.core:jackson-databind:2.6.1",
"jackson_core" : "com.fasterxml.jackson.core:jackson-core:2.6.1",
"jackson_annotations": "com.fasterxml.jackson.core:jackson-annotations:2.6.1",
"jackson_databind" : "com.fasterxml.jackson.core:jackson-databind:2.8.4",
"jackson_core" : "com.fasterxml.jackson.core:jackson-core:2.8.4",
"jackson_annotations" : "com.fasterxml.jackson.core:jackson-annotations:2.8.4",
"slf4j_api" : "org.slf4j:slf4j-api:1.7.21",
"slf4j_log4j" : "org.slf4j:log4j-over-slf4j:1.7.21",
"logback" : "ch.qos.logback:logback-classic:1.1.7",
"jasypt" : "org.jasypt:jasypt:1.9.2",
"slf4j_api" : "org.slf4j:slf4j-api:1.7.21",
"slf4j_log4j" : "org.slf4j:log4j-over-slf4j:1.7.21",
"logback" : "ch.qos.logback:logback-classic:1.1.7",
"jasypt" : "org.jasypt:jasypt:1.9.2",
"mockito" : "org.mockito:mockito-core:1.10.19",
"spring_context" : "org.springframework:spring-context:4.1.1.RELEASE",
"spring_jdbc" : "org.springframework:spring-jdbc:4.1.1.RELEASE",
"mockito" : "org.mockito:mockito-core:1.9.5",
"play" : "com.typesafe.play:play_2.10:2.2.4",
"play_ebean" : "com.typesafe.play:play-java-ebean_2.10:2.2.4",
"play_java_jdbc" : "com.typesafe.play:play-java-jdbc_2.10:2.2.4",
"play_cache" : "com.typesafe.play:play-cache_2.10:2.2.4",
"play" : "com.typesafe.play:play_2.10:2.4.8",
"play_java_jdbc" : "com.typesafe.play:play-java-jdbc_2.10:2.4.8",
"play_java_ws" : "com.typesafe.play:play-java-ws_2.10:2.4.8",
"play_cache" : "com.typesafe.play:play-cache_2.10:2.4.8",
"play_filter" : "com.typesafe.play:filters-helpers_2.10:2.4.8",
"typesafe_config" : "com.typesafe:config:1.3.1",
"netty" : "io.netty:netty:3.10.6.Final",
"akka" : "com.typesafe.akka:akka-actor_2.10:2.3.15",
"spring_jdbc" : "org.springframework:spring-jdbc:4.1.6.RELEASE",
"kafka" : "org.apache.kafka:kafka_2.10:0.10.0.1",
"kafka_clients" : "org.apache.kafka:kafka-clients:0.10.0.1",
"confluent_common_cfg" : "io.confluent:common-config:3.0.1"
"kafka" : "org.apache.kafka:kafka_2.10:0.10.0.1",
"kafka_clients" : "org.apache.kafka:kafka-clients:0.10.0.1",
"confluent_common_cfg": "io.confluent:common-config:3.0.1"
]
task buildWithWarning(type: JavaCompile, dependsOn: build) {

Binary file not shown.


@ -1,4 +1,4 @@
#Mon May 16 14:53:29 CEST 2016
#Wed Oct 19 15:56:42 PDT 2016
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME

gradlew vendored

@ -6,12 +6,30 @@
##
##############################################################################
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
@ -30,6 +48,7 @@ die ( ) {
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
@ -40,31 +59,11 @@ case "`uname`" in
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
# For Cygwin, ensure paths are in UNIX format before anything is touched.
if $cygwin ; then
[ -n "$JAVA_HOME" ] && JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
fi
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >&-
APP_HOME="`pwd -P`"
cd "$SAVED" >&-
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
@ -90,7 +89,7 @@ location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
@ -114,6 +113,7 @@ fi
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`

gradlew.bat vendored

@ -1,90 +1,90 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windowz variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega


@ -1,6 +1,3 @@
apply plugin: 'java'
version = '1.0'
configurations {
// configuration that holds jars to copy into lib
extraLibs
@ -9,7 +6,7 @@ configurations {
all*.exclude group: 'org.slf4j', module: 'slf4j-log4j12'
all*.exclude group: 'log4j'
all*.resolutionStrategy {
all*.resolutionStrategy {
dependencySubstitution {
substitute module('org.slf4j:slf4j-log4j12') with module('ch.qos.logback:logback-classic:1.1.7')
//prefer 'log4j-over-slf4j' over 'log4j'
@ -21,7 +18,6 @@ dependencies {
extraLibs project(":wherehows-common")
//extraLibs files("extralibs/linkedin-pig-0.11.1.49.jar")
//extraLibs files("extralibs/voldemort-0.91.li1.jar")
extraLibs externalDependency.joda
extraLibs externalDependency.avro_mapred
extraLibs externalDependency.hive_exec
extraLibs externalDependency.pig
@ -32,7 +28,6 @@ dependencies {
compile externalDependency.pig
compile externalDependency.avro
compile externalDependency.avro_mapred
compile externalDependency.joda
compile externalDependency.hive_exec
compile externalDependency.http_client
compile externalDependency.http_core


@ -1,5 +1,4 @@
apply plugin: 'application'
version = '1.0'
mainClassName = 'metadata.etl.Launcher'
@ -11,7 +10,7 @@ configurations {
all*.exclude group: 'org.slf4j', module: 'slf4j-log4j12'
all*.exclude group: 'log4j'
all*.resolutionStrategy {
all*.resolutionStrategy {
dependencySubstitution {
substitute module('org.slf4j:slf4j-log4j12') with module('ch.qos.logback:logback-classic:1.1.7')
//prefer 'log4j-over-slf4j' over 'log4j'


@ -18,7 +18,7 @@ import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.dispatch.Futures;
import akka.pattern.Patterns;
import akka.routing.SmallestMailboxRouter;
import akka.routing.SmallestMailboxPool;
import akka.util.Timeout;
import java.sql.Connection;
import java.sql.DriverManager;
@ -83,8 +83,8 @@ public class AzLineageExtractorMaster {
ActorSystem actorSystem = ActorSystem.create("LineageExtractor");
int numOfActor = Integer.valueOf(prop.getProperty(Constant.LINEAGE_ACTOR_NUM, "50"));
ActorRef lineageExtractorActor = actorSystem
.actorOf(Props.create(AzLineageExtractorActor.class)
.withRouter(new SmallestMailboxRouter(numOfActor)), "lineageExtractorActor");
.actorOf(new SmallestMailboxPool(numOfActor).props(Props.create(AzLineageExtractorActor.class)),
"lineageExtractorActor");
// initialize
//AzkabanServiceCommunicator asc = new AzkabanServiceCommunicator(prop);

web/app/Filters.java Normal file

@ -0,0 +1,29 @@
/**
* Copyright 2015 LinkedIn Corp. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
import javax.inject.Inject;
import play.api.mvc.EssentialFilter;
import play.filters.gzip.GzipFilter;
import play.http.HttpFilters;
public class Filters implements HttpFilters {
@Inject
GzipFilter gzipFilter;
public EssentialFilter[] filters() {
return new EssentialFilter[] { gzipFilter };
}
}


@ -1,43 +0,0 @@
/**
* Copyright 2015 LinkedIn Corp. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
import play.*;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import play.Application;
import play.GlobalSettings;
import play.api.mvc.EssentialFilter;
import play.filters.gzip.GzipFilter;
public class Global extends GlobalSettings{
private ApplicationContext applicationContext;
public <T extends EssentialFilter> Class<T>[] filters() {
return new Class[]{GzipFilter.class};
}
@Override
public void onStart(Application arg0) {
applicationContext = new ClassPathXmlApplicationContext("components.xml");
}
@Override
public <A> A getControllerInstance(Class<A> type) throws Exception {
return applicationContext.getBean(type);
}
}


@ -13,8 +13,8 @@
*/
package controllers;
import play.mvc.*;
import play.mvc.Http.*;
import play.mvc.Security;
import play.mvc.Http.Context;
import play.mvc.Result;
public class Secured extends Security.Authenticator


@ -18,14 +18,11 @@ import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
import models.DatasetColumn;
import models.DatasetDependency;
import models.DatasetListViewNode;
import models.ImpactDataset;
import play.Play;
import play.api.libs.json.JsValue;
import play.api.mvc.SimpleResult;
import play.libs.F;
import play.libs.F.Promise;
import play.libs.Json;
import play.libs.WS;
import play.libs.ws.*;
import play.mvc.BodyParser;
import play.mvc.Controller;
import play.mvc.Result;
@ -36,7 +33,6 @@ import dao.DatasetsDAO;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import scala.concurrent.Future;
public class Dataset extends Controller
@ -45,6 +41,8 @@ public class Dataset extends Controller
public static final String DATASET_SECURITY_PATH = "/dataset/security";
public static final String BACKEND_URL = Play.application().configuration().getString(BACKEND_SERVICE_URL_KEY);
public static Result getDatasetOwnerTypes()
{
ObjectNode result = Json.newObject();
@ -375,7 +373,7 @@ public class Dataset extends Controller
catch(NumberFormatException e)
{
Logger.error("Dataset Controller getPagedDatasetComments wrong page parameter. Error message: " +
e.getMessage());
e.getMessage());
page = 1;
}
}
@ -395,7 +393,7 @@ public class Dataset extends Controller
catch(NumberFormatException e)
{
Logger.error("Dataset Controller getPagedDatasetComments wrong size parameter. Error message: " +
e.getMessage());
e.getMessage());
size = 10;
}
}
@ -439,33 +437,33 @@ public class Dataset extends Controller
public static Result putDatasetComment(int id, int commentId)
{
String body = request().body().asText();
ObjectNode result = Json.newObject();
String username = session("user");
Map<String, String[]> params = request().body().asFormUrlEncoded();
String body = request().body().asText();
ObjectNode result = Json.newObject();
String username = session("user");
Map<String, String[]> params = request().body().asFormUrlEncoded();
if(StringUtils.isNotBlank(username))
{
if(DatasetsDAO.postComment(id, params, username))
if (StringUtils.isNotBlank(username))
{
result.put("status", "success");
return ok(result);
if (DatasetsDAO.postComment(id, params, username))
{
result.put("status", "success");
return ok(result);
}
else
{
result.put("status", "failed");
result.put("error", "true");
result.put("msg", "Could not create comment.");
return badRequest(result);
}
}
else
{
result.put("status", "failed");
result.put("error", "true");
result.put("msg", "Could not create comment.");
return badRequest(result);
result.put("status", "failed");
result.put("error", "true");
result.put("msg", "Unauthorized User");
return badRequest(result);
}
}
else
{
result.put("status", "failed");
result.put("error", "true");
result.put("msg", "Unauthorized User");
return badRequest(result);
}
}
@ -613,7 +611,7 @@ public class Dataset extends Controller
catch(NumberFormatException e)
{
Logger.error("Dataset Controller getPagedDatasetColumnComments wrong page parameter. Error message: " +
e.getMessage());
e.getMessage());
page = 1;
}
}
@ -633,7 +631,7 @@ public class Dataset extends Controller
catch(NumberFormatException e)
{
Logger.error("Dataset Controller getPagedDatasetColumnComments wrong size parameter. Error message: " +
e.getMessage());
e.getMessage());
size = 10;
}
}
@ -702,40 +700,40 @@ public class Dataset extends Controller
ObjectNode json = Json.newObject();
ArrayNode res = json.arrayNode();
JsonNode req = request().body().asJson();
if(req == null) {
if (req == null) {
return badRequest("Expecting JSON data");
}
if(req.isArray()) {
for(int i = 0; i < req.size(); i++) {
if (req.isArray()) {
for (int i = 0; i < req.size(); i++) {
JsonNode obj = req.get(i);
Boolean isSuccess = DatasetsDAO.assignColumnComment(
obj.get("datasetId").asInt(),
obj.get("columnId").asInt(),
obj.get("commentId").asInt());
obj.get("datasetId").asInt(),
obj.get("columnId").asInt(),
obj.get("commentId").asInt());
ObjectNode itemResponse = Json.newObject();
if(isSuccess) {
if (isSuccess) {
itemResponse.put("success", "true");
} else {
itemResponse.put("error", "true");
itemResponse.put("datasetId", datasetId);
itemResponse.put("columnId", columnId);
itemResponse.put("commentId", obj.get("comment_id"));
itemResponse.set("commentId", obj.get("comment_id"));
}
res.add(itemResponse);
}
} else {
Boolean isSuccess = DatasetsDAO.assignColumnComment(
datasetId,
columnId,
req.get("commentId").asInt());
datasetId,
columnId,
req.get("commentId").asInt());
ObjectNode itemResponse = Json.newObject();
if(isSuccess) {
if (isSuccess) {
itemResponse.put("success", "true");
} else {
itemResponse.put("error", "true");
itemResponse.put("datasetId", datasetId);
itemResponse.put("columnId", columnId);
itemResponse.put("commentId", req.get("commentId"));
itemResponse.set("commentId", req.get("commentId"));
}
res.add(itemResponse);
}
@ -761,34 +759,34 @@ public class Dataset extends Controller
public static Result getSimilarColumnComments(Long datasetId, int columnId) {
ObjectNode result = Json.newObject();
result.put("similar", Json.toJson(DatasetsDAO.similarColumnComments(datasetId, columnId)));
result.set("similar", Json.toJson(DatasetsDAO.similarColumnComments(datasetId, columnId)));
return ok(result);
}
public static Result getSimilarColumns(int datasetId, int columnId)
{
ObjectNode result = Json.newObject();
result.put("similar", Json.toJson(DatasetsDAO.similarColumns(datasetId, columnId)));
result.set("similar", Json.toJson(DatasetsDAO.similarColumns(datasetId, columnId)));
return ok(result);
}
public static Result getDependViews(Long datasetId)
{
ObjectNode result = Json.newObject();
List<DatasetDependency> depends = new ArrayList<DatasetDependency>();
List<DatasetDependency> depends = new ArrayList<>();
DatasetsDAO.getDependencies(datasetId, depends);
result.put("status", "ok");
result.put("depends", Json.toJson(depends));
result.set("depends", Json.toJson(depends));
return ok(result);
}
public static Result getReferenceViews(Long datasetId)
{
ObjectNode result = Json.newObject();
List<DatasetDependency> references = new ArrayList<DatasetDependency>();
List<DatasetDependency> references = new ArrayList<>();
DatasetsDAO.getReferences(datasetId, references);
result.put("status", "ok");
result.put("references", Json.toJson(references));
result.set("references", Json.toJson(references));
return ok(result);
}
@ -797,7 +795,7 @@ public class Dataset extends Controller
ObjectNode result = Json.newObject();
String urn = request().getQueryString("urn");
result.put("status", "ok");
result.put("nodes", Json.toJson(DatasetsDAO.getDatasetListViewNodes(urn)));
result.set("nodes", Json.toJson(DatasetsDAO.getDatasetListViewNodes(urn)));
return ok(result);
}
@ -805,7 +803,7 @@ public class Dataset extends Controller
{
ObjectNode result = Json.newObject();
result.put("status", "ok");
result.put("versions", Json.toJson(DatasetsDAO.getDatasetVersions(datasetId, dbId)));
result.set("versions", Json.toJson(DatasetsDAO.getDatasetVersions(datasetId, dbId)));
return ok(result);
}
@ -813,7 +811,7 @@ public class Dataset extends Controller
{
ObjectNode result = Json.newObject();
result.put("status", "ok");
result.put("schema_text", Json.toJson(DatasetsDAO.getDatasetSchemaTextByVersion(datasetId, version)));
result.set("schema_text", Json.toJson(DatasetsDAO.getDatasetSchemaTextByVersion(datasetId, version)));
return ok(result);
}
@ -821,7 +819,7 @@ public class Dataset extends Controller
{
ObjectNode result = Json.newObject();
result.put("status", "ok");
result.put("instances", Json.toJson(DatasetsDAO.getDatasetInstances(datasetId)));
result.set("instances", Json.toJson(DatasetsDAO.getDatasetInstances(datasetId)));
return ok(result);
}
@ -829,7 +827,7 @@ public class Dataset extends Controller
{
ObjectNode result = Json.newObject();
result.put("status", "ok");
result.put("partitions", Json.toJson(DatasetsDAO.getDatasetPartitionGains(datasetId)));
result.set("partitions", Json.toJson(DatasetsDAO.getDatasetPartitionGains(datasetId)));
return ok(result);
}
@ -837,53 +835,43 @@ public class Dataset extends Controller
{
ObjectNode result = Json.newObject();
result.put("status", "ok");
result.put("access", Json.toJson(DatasetsDAO.getDatasetAccessibilty(datasetId)));
result.set("access", Json.toJson(DatasetsDAO.getDatasetAccessibilty(datasetId)));
return ok(result);
}
public static F.Promise<Result> getDatasetSecurity(int datasetId) {
final String backendUrl = Play.application().configuration().getString(BACKEND_SERVICE_URL_KEY);
final String queryUrl = backendUrl + DATASET_SECURITY_PATH;
final F.Promise<Result> resultPromise = WS.url(queryUrl)
public static Promise<Result> getDatasetSecurity(int datasetId) {
final String queryUrl = BACKEND_URL + DATASET_SECURITY_PATH;
return WS.url(queryUrl)
.setQueryParameter("datasetId", Integer.toString(datasetId))
.setRequestTimeout(1000)
.get()
.map(new F.Function<WS.Response, Result>() {
public Result apply(WS.Response response) {
return ok(response.asJson());
}
});
return resultPromise;
.map(response ->
ok(response.asJson())
);
}
public static F.Promise<Result> updateDatasetSecurity(int datasetId) {
public static Promise<Result> updateDatasetSecurity(int datasetId) {
String username = session("user");
if (StringUtils.isNotBlank(username)) {
final String backendUrl = Play.application().configuration().getString(BACKEND_SERVICE_URL_KEY);
final String queryUrl = backendUrl + DATASET_SECURITY_PATH;
final String queryUrl = BACKEND_URL + DATASET_SECURITY_PATH;
final ObjectNode queryNode = Json.newObject();
queryNode.put("datasetId", datasetId);
queryNode.put("securitySpec", request().body().asJson());
final JsonNode queryNode = Json.newObject()
.put("datasetId", datasetId)
.set("securitySpec", request().body().asJson());
final F.Promise<Result> resultPromise = WS.url(queryUrl)
return WS.url(queryUrl)
.setRequestTimeout(1000)
.post(queryNode)
.map(new F.Function<WS.Response, Result>() {
public Result apply(WS.Response response) {
return ok(response.asJson());
}
});
return resultPromise;
.map(response ->
ok(response.asJson())
);
} else {
ObjectNode result = Json.newObject();
result.put("status", "failed");
result.put("error", "true");
result.put("msg", "Unauthorized User.");
final JsonNode result = Json.newObject()
.put("status", "failed")
.put("error", "true")
.put("msg", "Unauthorized User.");
return F.Promise.promise(new F.Function0<Result>() {
public Result apply() {
return ok(result);
}
});
return Promise.promise(() -> ok(result));
}
}
}
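The getDatasetSecurity/updateDatasetSecurity hunks above replace anonymous `F.Function` classes with Java 8 lambdas, one of the idiomatic cleanups the Play 2.4 upgrade enables. The same before/after shape can be sketched with the stdlib's `CompletableFuture` standing in for Play's `F.Promise`; the `ok` helper here is a hypothetical stand-in for the controller's `ok(...)` result, not Play's actual API:

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

public class PromiseLambdaSketch {
    // Hypothetical stand-in for Play's `ok(response.asJson())`: wrap a body.
    static String ok(String body) {
        return "ok(" + body + ")";
    }

    public static void main(String[] args) throws Exception {
        // Play 2.2 style: the mapping is an anonymous inner class,
        // mirroring `new F.Function<WS.Response, Result>() { ... }`.
        CompletableFuture<String> before = CompletableFuture
            .supplyAsync(() -> "{\"status\":\"ok\"}")
            .thenApply(new Function<String, String>() {
                public String apply(String body) {
                    return ok(body);
                }
            });

        // Play 2.4 / Java 8 style: the same mapping as a lambda,
        // mirroring `.map(response -> ok(response.asJson()))`.
        CompletableFuture<String> after = CompletableFuture
            .supplyAsync(() -> "{\"status\":\"ok\"}")
            .thenApply(body -> ok(body));

        System.out.println(before.get().equals(after.get()));
    }
}
```

Both forms compile to the same behavior; the lambda only removes the boilerplate of naming the functional interface.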


@ -28,9 +28,9 @@ import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import play.Logger;
import play.Play;
import play.libs.F;
import play.libs.F.Promise;
import play.libs.Json;
import play.libs.WS;
import play.libs.ws.*;
import java.util.*;
@ -94,7 +94,6 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
"a.app_code, f.flow_id, f.flow_name, f.flow_path, f.flow_group FROM flow f " +
"JOIN cfg_application a on f.app_id = a.app_id ";
public final static String ADV_SEARCH_JOB = "SELECT SQL_CALC_FOUND_ROWS " +
"a.app_code, f.flow_name, f.flow_path, f.flow_group, j.flow_id, j.job_id, " +
"j.job_name, j.job_path, j.job_type " +
@ -109,11 +108,9 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
"metric_formula, dimensions, owners, tags, urn, metric_url, wiki_url, scm_url, 0 as watch_id " +
"FROM dict_business_metric ";
public static List<String> getDatasetSources()
{
return getJdbcTemplate().queryForList(GET_DATASET_SOURCES, String.class);
return getJdbcTemplate().queryForList(GET_DATASET_SOURCES, String.class);
}
public static List<String> getDatasetScopes()
@ -206,7 +203,7 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
{
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<Dataset> pagedDatasets = new ArrayList<Dataset>();
List<Dataset> pagedDatasets = new ArrayList<>();
ObjectNode queryNode = Json.newObject();
queryNode.put("from", (page-1)*size);
queryNode.put("size", size);
@ -215,12 +212,12 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
if (searchNode != null && searchNode.isContainerNode())
{
queryNode.put("query", searchNode);
queryNode.set("query", searchNode);
}
F.Promise < WS.Response> responsePromise = WS.url(
Promise<WSResponse> responsePromise = WS.url(
Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_DATASET_URL_KEY)).post(queryNode);
JsonNode responseNode = responsePromise.get().asJson();
JsonNode responseNode = responsePromise.get(1000).asJson();
resultNode.put("page", page);
resultNode.put("category", "Datasets");
@ -277,7 +274,7 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
{
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<Metric> pagedMetrics = new ArrayList<Metric>();
List<Metric> pagedMetrics = new ArrayList<>();
ObjectNode queryNode = Json.newObject();
queryNode.put("from", (page-1)*size);
queryNode.put("size", size);
@ -286,12 +283,12 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
if (searchNode != null && searchNode.isContainerNode())
{
queryNode.put("query", searchNode);
queryNode.set("query", searchNode);
}
F.Promise < WS.Response> responsePromise = WS.url(Play.application().configuration().getString(
Promise<WSResponse> responsePromise = WS.url(Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_METRIC_URL_KEY)).post(queryNode);
JsonNode responseNode = responsePromise.get().asJson();
JsonNode responseNode = responsePromise.get(1000).asJson();
resultNode.put("page", page);
resultNode.put("category", "Metrics");
@ -363,7 +360,7 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
{
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<FlowJob> pagedFlows = new ArrayList<FlowJob>();
List<FlowJob> pagedFlows = new ArrayList<>();
ObjectNode queryNode = Json.newObject();
queryNode.put("from", (page-1)*size);
queryNode.put("size", size);
@ -372,12 +369,12 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
if (searchNode != null && searchNode.isContainerNode())
{
queryNode.put("query", searchNode);
queryNode.set("query", searchNode);
}
F.Promise < WS.Response> responsePromise = WS.url(Play.application().configuration().getString(
Promise<WSResponse> responsePromise = WS.url(Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_FLOW_URL_KEY)).post(queryNode);
JsonNode responseNode = responsePromise.get().asJson();
JsonNode responseNode = responsePromise.get(1000).asJson();
resultNode.put("page", page);
resultNode.put("category", "Flows");
@ -1435,6 +1432,7 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
boolean jobNeedAndKeyword = false;
if (jobInList.size() > 0)
{
query += "( ";
int indexForJobInList = 0;
for (String job : jobInList)
{
@ -1473,6 +1471,7 @@ public class AdvSearchDAO extends AbstractMySQLOpenSourceDAO
}
query += ") ";
}
query += " ) ";
}
query += " LIMIT " + (page-1)*size + ", " + size;
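The two added lines in this hunk (`query += "( ";` and `query += " ) ";`) wrap the OR'd job clauses in parentheses before further conditions and the LIMIT are appended. Since AND binds tighter than OR in SQL, the ungrouped form lets a row match on a job name alone, ignoring the preceding conditions. A boolean sketch of the two forms; the clause strings are illustrative, not the DAO's actual SQL:

```java
public class SqlGroupingSketch {
    // Build a WHERE fragment; `grouped` mirrors the fix of wrapping the OR'd
    // job conditions in "( ... )" so they bind as one unit against the AND.
    static String where(boolean grouped) {
        String jobClauses = "job_name LIKE '%a%' OR job_name LIKE '%b%'";
        return grouped ? "flow_id = 1 AND (" + jobClauses + ")"
                       : "flow_id = 1 AND " + jobClauses;
    }

    // Boolean analog of the two WHERE clauses: AND binds tighter than OR,
    // so the ungrouped form can match when only the second OR arm is true.
    static boolean matches(boolean flowOk, boolean jobA, boolean jobB, boolean grouped) {
        return grouped ? flowOk && (jobA || jobB)
                       : flowOk && jobA || jobB;
    }

    public static void main(String[] args) {
        // Row with the wrong flow_id but a matching second job name: the
        // ungrouped clause wrongly accepts it, the grouped clause rejects it.
        System.out.println(matches(false, false, true, false)); // true  (bug)
        System.out.println(matches(false, false, true, true));  // false (fixed)
    }
}
```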


@ -39,11 +39,8 @@ import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import play.Logger;
import play.Play;
import play.libs.F;
import play.libs.Json;
import models.*;
import play.libs.WS;
import utils.Lineage;
public class DatasetsDAO extends AbstractMySQLOpenSourceDAO
{


@ -28,11 +28,11 @@ import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import play.Logger;
import play.Play;
import play.libs.F;
import play.libs.F.Promise;
import play.libs.Json;
import play.libs.ws.*;
import play.cache.Cache;
import models.*;
import play.libs.WS;
public class SearchDAO extends AbstractMySQLOpenSourceDAO
{
@ -202,15 +202,15 @@ public class SearchDAO extends AbstractMySQLOpenSourceDAO
if (keywordNode != null)
{
queryNode.put("query", keywordNode);
F.Promise < WS.Response> responsePromise = WS.url(Play.application().configuration().getString(
queryNode.set("query", keywordNode);
Promise<WSResponse> responsePromise = WS.url(Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_DATASET_URL_KEY)).post(queryNode);
responseNode = responsePromise.get().asJson();
responseNode = responsePromise.get(1000).asJson();
}
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<Dataset> pagedDatasets = new ArrayList<Dataset>();
List<Dataset> pagedDatasets = new ArrayList<>();
resultNode.put("page", page);
resultNode.put("category", category);
resultNode.put("source", source);
@ -303,15 +303,15 @@ public class SearchDAO extends AbstractMySQLOpenSourceDAO
if (keywordNode != null)
{
queryNode.put("query", keywordNode);
F.Promise < WS.Response> responsePromise = WS.url(Play.application().configuration().getString(
queryNode.set("query", keywordNode);
Promise<WSResponse> responsePromise = WS.url(Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_METRIC_URL_KEY)).post(queryNode);
responseNode = responsePromise.get().asJson();
responseNode = responsePromise.get(1000).asJson();
}
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<Metric> pagedMetrics = new ArrayList<Metric>();
List<Metric> pagedMetrics = new ArrayList<>();
resultNode.put("page", page);
resultNode.put("category", category);
resultNode.put("isMetrics", true);
@ -413,15 +413,15 @@ public class SearchDAO extends AbstractMySQLOpenSourceDAO
if (keywordNode != null)
{
queryNode.put("query", keywordNode);
F.Promise < WS.Response> responsePromise = WS.url(Play.application().configuration().getString(
queryNode.set("query", keywordNode);
Promise<WSResponse> responsePromise = WS.url(Play.application().configuration().getString(
SearchDAO.ELASTICSEARCH_FLOW_URL_KEY)).post(queryNode);
responseNode = responsePromise.get().asJson();
responseNode = responsePromise.get(1000).asJson();
}
ObjectNode resultNode = Json.newObject();
Long count = 0L;
List<FlowJob> pagedFlowJobs = new ArrayList<FlowJob>();
List<FlowJob> pagedFlowJobs = new ArrayList<>();
resultNode.put("page", page);
resultNode.put("category", category);
resultNode.put("isFlowJob", true);


@ -72,7 +72,8 @@ public class AuthenticationManager {
Play.application().configuration().getString(LDAP_SEARCH_BASE_KEY).split("\\s*\\|\\s*");
DirContext ctx = null;
for (int i = 0; i < ldapUrls.length; i++) {
int i;
for (i = 0; i < ldapUrls.length; i++) {
try {
Hashtable<String, String> env =
buildEnvContext(userName, password, contextFactories, ldapUrls[i], principalDomains[i]);
@ -84,9 +85,10 @@ public class AuthenticationManager {
break;
} catch (NamingException e) {
// Logger.error("Ldap authentication failed for user " + userName + " - " + principalDomains[i] + " - " + ldapUrls[i], e);
UserDAO.insertLoginHistory(userName, "LDAP", "FAILURE", ldapUrls[i] + e.getMessage());
// if exhausted all ldap options and can't authenticate user
if (i >= ldapUrls.length - 1) {
UserDAO.insertLoginHistory(userName, "LDAP", "FAILURE", e.getMessage());
throw e;
}
} catch (SQLException e) {
@ -99,7 +101,7 @@ public class AuthenticationManager {
}
}
}
UserDAO.insertLoginHistory(userName, "LDAP", "SUCCESS", null);
UserDAO.insertLoginHistory(userName, "LDAP", "SUCCESS", ldapUrls[i]);
}
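The AuthenticationManager change hoists the loop index out of the `for` statement so it survives the `break`, letting the success log record which LDAP URL actually worked, and it defers the failure record until every URL has been tried. A minimal stdlib sketch of that pattern; `connect` and the URL strings are hypothetical stand-ins for the LDAP bind:

```java
import java.util.List;

public class RetryLoopSketch {
    // Try each endpoint in order; declaring the index outside the loop (as
    // the hunk does with `int i;`) keeps it in scope after `break`, so the
    // caller can log which endpoint finally succeeded.
    static String firstReachable(List<String> urls) {
        int i;
        for (i = 0; i < urls.size(); i++) {
            try {
                connect(urls.get(i));
                break; // success: stop trying further endpoints
            } catch (RuntimeException e) {
                // Only surface the failure once every endpoint is exhausted,
                // mirroring `if (i >= ldapUrls.length - 1) throw e;`
                if (i >= urls.size() - 1) {
                    throw e;
                }
            }
        }
        return urls.get(i); // index survives the loop, e.g. for a success log
    }

    // Hypothetical stand-in for the real LDAP bind.
    static void connect(String url) {
        if (!url.startsWith("good")) {
            throw new IllegalStateException("unreachable: " + url);
        }
    }

    public static void main(String[] args) {
        System.out.println(firstReachable(List.of("bad-1", "good-2")));
    }
}
```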
private static Hashtable<String, String> buildEnvContext(String username, String password, String contextFactory,


@ -88,7 +88,7 @@ public class SampleData
if (dataNode != null && dataNode.isArray()) {
for (JsonNode node : dataNode) {
JsonNode valueNode = node.get(i);
((ArrayNode)node).set(i, new TextNode("********"));
((ArrayNode) node).set(i, new TextNode("********"));
}
}
}
@ -105,31 +105,31 @@ public class SampleData
{
if (sampleRowNode.has(MEMBER_ID))
{
((ObjectNode)sampleRowNode).put(MEMBER_ID, new TextNode("********"));
((ObjectNode) sampleRowNode).set(MEMBER_ID, new TextNode("********"));
}
if (sampleRowNode.has(TREE_ID))
{
JsonNode treeIdNode = sampleRowNode.get(TREE_ID);
String convertedValue = convertToHexString(treeIdNode);
((ObjectNode) sampleRowNode).put(TREE_ID, new TextNode(convertedValue));
((ObjectNode) sampleRowNode).set(TREE_ID, new TextNode(convertedValue));
}
if (sampleRowNode.has(TRACKING_ID))
{
JsonNode trackingIdNode = sampleRowNode.get(TRACKING_ID);
String convertedValue = convertToHexString(trackingIdNode);
((ObjectNode) sampleRowNode).put(TRACKING_ID, new TextNode(convertedValue));
((ObjectNode) sampleRowNode).set(TRACKING_ID, new TextNode(convertedValue));
}
if (sampleRowNode.has(IP_AS_BYTES))
{
JsonNode ipNode = sampleRowNode.get(IP_AS_BYTES);
String convertedValue = convertToHexString(ipNode);
((ObjectNode) sampleRowNode).put(IP_AS_BYTES, new TextNode(convertedValue));
((ObjectNode) sampleRowNode).set(IP_AS_BYTES, new TextNode(convertedValue));
}
if (sampleRowNode.has(IP_AS_BYTES_1))
{
JsonNode ipNode = sampleRowNode.get(IP_AS_BYTES_1);
String convertedValue = convertToHexString(ipNode);
((ObjectNode) sampleRowNode).put(IP_AS_BYTES_1, new TextNode(convertedValue));
((ObjectNode) sampleRowNode).set(IP_AS_BYTES_1, new TextNode(convertedValue));
}
if (sampleRowNode.has(ATTACHMENTS))
{
@ -138,14 +138,14 @@ public class SampleData
{
JsonNode payloadNode = attachmentNode.get(PAYLOAD);
String value = "** " + Integer.toString(payloadNode.size()) + " bytes binary data **";
((ObjectNode) attachmentNode).put(PAYLOAD, new TextNode(value));
((ObjectNode) attachmentNode).set(PAYLOAD, new TextNode(value));
}
}
if (sampleRowNode.has(MEDIA))
{
JsonNode mediaNode = sampleRowNode.get(MEDIA);
String convertedValue = convertToHexString(mediaNode);
((ObjectNode) sampleRowNode).put(MEDIA, new TextNode(convertedValue));
((ObjectNode) sampleRowNode).set(MEDIA, new TextNode(convertedValue));
}
if (sampleRowNode.has(HEADER))
{
@ -154,19 +154,19 @@ public class SampleData
{
if (headerNode.has(MEMBER_ID))
{
((ObjectNode)headerNode).put(MEMBER_ID, new TextNode("********"));
((ObjectNode) headerNode).set(MEMBER_ID, new TextNode("********"));
}
if (headerNode.has(GUID))
{
JsonNode guidNode = headerNode.get(GUID);
String convertedValue = convertToHexString(guidNode);
((ObjectNode) headerNode).put(GUID, new TextNode(convertedValue));
((ObjectNode) headerNode).set(GUID, new TextNode(convertedValue));
}
if (headerNode.has(TREE_ID))
{
JsonNode headerTreeIdNode = headerNode.get(TREE_ID);
String convertedValue = convertToHexString(headerTreeIdNode);
((ObjectNode) headerNode).put(TREE_ID, new TextNode(convertedValue));
((ObjectNode) headerNode).set(TREE_ID, new TextNode(convertedValue));
}
if (headerNode.has(AUDITHEADER))
{
@ -175,7 +175,7 @@ public class SampleData
{
JsonNode messageIdNode = auditHeaderNode.get(MESSAGE_ID);
String convertedValue = convertToHexString(messageIdNode);
((ObjectNode) auditHeaderNode).put(MESSAGE_ID, new TextNode(convertedValue));
((ObjectNode) auditHeaderNode).set(MESSAGE_ID, new TextNode(convertedValue));
}
}
}
@ -190,9 +190,9 @@ public class SampleData
{
JsonNode payloadNode = attachmentsNode.get(PAYLOAD);
String value = "** " +
Integer.toString(payloadNode.size()) +
" bytes binary data **";
((ObjectNode) attachmentsNode).put(PAYLOAD, new TextNode(value));
Integer.toString(payloadNode.size()) +
" bytes binary data **";
((ObjectNode) attachmentsNode).set(PAYLOAD, new TextNode(value));
}
}
}
@ -203,13 +203,13 @@ public class SampleData
{
JsonNode ipNode = requestHeaderNode.get(IP_AS_BYTES);
String convertedValue = convertToHexString(ipNode);
((ObjectNode) requestHeaderNode).put(IP_AS_BYTES, new TextNode(convertedValue));
((ObjectNode) requestHeaderNode).set(IP_AS_BYTES, new TextNode(convertedValue));
}
if (requestHeaderNode != null && requestHeaderNode.has(IP_AS_BYTES_1))
{
JsonNode ipNode = requestHeaderNode.get(IP_AS_BYTES_1);
String convertedValue = convertToHexString(ipNode);
((ObjectNode) requestHeaderNode).put(IP_AS_BYTES_1, new TextNode(convertedValue));
((ObjectNode) requestHeaderNode).set(IP_AS_BYTES_1, new TextNode(convertedValue));
}
}
}
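The SampleData hunk masks binary sample fields by storing hex strings produced by `convertToHexString`, whose body is not part of this diff. A hedged guess at what such a helper might look like, shown only to make the masking readable; the name `toHex` and its lowercase output are assumptions:

```java
public class HexMaskSketch {
    // Hypothetical stand-in for SampleData's convertToHexString: render raw
    // bytes as a hex string so binary values can be displayed, not dumped.
    static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder(bytes.length * 2);
        for (byte b : bytes) {
            sb.append(String.format("%02x", b)); // each byte as two hex digits
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toHex(new byte[] {0x0A, (byte) 0xFF}));
    }
}
```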


@ -484,347 +484,6 @@
{{/if}}
</script>
<script type="text/x-handlebars" id="components/dataset-relations">
{{#if hasDepends}}
<table id="depends-table" class="columntreegrid tree table table-bordered dataset-detail-table">
<thead>
<tr class="results-header">
<th class="span2">Depends on</th>
<th class="span1">Object Type</th>
<th class="span1">Object Sub Type</th>
<th class="span2">Level</th>
</tr>
</thead>
<tbody>
{{#each depends as |depend|}}
<tr class="{{depend.treeGridClass}}">
<td>
{{#if depend.isValidDataset}}
<a href={{depend.datasetLink}}>
{{depend.objectName}}
</a>
{{else}}
{{depend.objectName}}
{{/if}}
</td>
<td>{{depend.objectType}}</td>
<td>{{depend.objectSubType}}</td>
<td>{{depend.level}}</td>
</tr>
{{/each}}
</tbody>
</table>
{{/if}}
{{#if hasReferences}}
<table id="references-table" class="columntreegrid tree table table-bordered dataset-detail-table">
<thead>
<tr class="results-header">
<th class="span2">Referred By</th>
<th class="span1">Object Type</th>
<th class="span1">Object Sub Type</th>
<th class="span2">Level</th>
</tr>
</thead>
<tbody>
{{#each references as |reference|}}
<tr class="{{reference.treeGridClass}}">
<td>
{{#if reference.isValidDataset}}
<a href={{reference.datasetLink}}>
{{reference.objectName}}
</a>
{{else}}
{{reference.objectName}}
{{/if}}
</td>
<td>{{reference.objectType}}</td>
<td>{{reference.objectSubType}}</td>
<td>{{reference.level}}</td>
</tr>
{{/each}}
</tbody>
</table>
{{/if}}
</script>
<script type="text/x-handlebars" id="components/dataset-access">
{{#if hasAccess}}
{{#each accessibilities as |accessibility|}}
<h4>Partition By: {{accessibility.partition}}</h4>
<table id="references-table" class="columntreegrid tree table table-bordered dataset-detail-table">
<thead>
<tr class="results-header">
<th>Time</th>
{{#each accessibility.instanceList as |instance|}}
<th>{{instance}}</th>
{{/each}}
</tr>
</thead>
<tbody>
{{#each accessibility.accessibilityList as |access|}}
<tr>
<td>{{access.dataTimeExpr}}</td>
{{#each access.itemList as |item|}}
<td>
{{#unless item.isPlaceHolder}}
<div>
<p>Log time: {{item.logTimeEpochStr}}</p>
<p>Record count: {{item.recordCountStr}}</p>
</div>
{{/unless}}
</td>
{{/each}}
</tr>
{{/each}}
</tbody>
</table>
{{/each}}
{{else}}
<p>Accessibility is not available</p>
{{/if}}
</script>
<script type="text/x-handlebars" id="components/metric-detail">
<div id="metric" class="container-fluid">
<div class="row-fluid">
<div class="col-xs-6">
<h3>{{ model.name }}</h3>
</div>
<div class="col-xs-6 text-right">
<ul class="datasetDetailsLinks">
<li>
<i class="fa fa-share-alt"></i>
<span class="hidden-sm hidden-xs">
Share
</span>
</li>
<li>
{{#metric-watch metric=model showText=true getMetrics='getMetrics'}}
{{/metric-watch}}
</li>
{{#if showLineage}}
<li>
<a target="_blank" href={{lineageUrl}}>
<i class="fa fa-sitemap"></i>
<span class="hidden-sm hidden-xs">
View Lineage
</span>
</a>
</li>
{{/if}}
</ul>
</div>
<div class="col-xs-12">
Metric Description:
<a
href="#"
data-name="description"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter description"
data-emptytext="Please Input"
data-placeholder="Please Input"
>
{{model.description}}
</a>
</div>
</div>
<table class="tree table table-bordered">
<tbody>
<tr class="result">
<td class="span2" style="min-width:200px;">Dashboard Name</td>
<td>
<a
href="#"
data-name="dashboardName"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter dashboard name"
data-defaultValue="Please Input"
data-emptytext="Please Input"
data-value="{{model.dashboardName}}"
>
{{model.dashboardName}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Category</td>
<td>
<a
href="#"
data-name="category"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter metric category"
data-emptytext="Please Input"
>
{{model.category}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Group</td>
<td>
<a
href="#"
data-name="group"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter group"
data-emptytext="Please Input"
>
{{model.group}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Type</td>
<td>
<a
href="#"
data-name="refIDType"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter Type"
data-emptytext="Please Input"
data-value={{model.refIDType}}
>
</a>
</td>
</tr>
<tr class="result">
<td>Metric Grain</td>
<td>
<a
href="#"
data-name="grain"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter grain"
data-emptytext="Please Input"
>
{{model.grain}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Formula</td>
<td>
{{ace-editor content=model.formula itemId=model.id savePath="/api/v1/metrics/{id}/update" saveParam="formula"}}
</td>
</tr>
<tr class="result">
<td>Metric Display Factor</td>
<td>
<a
href="#"
data-name="displayFactory"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter display factor"
data-emptytext="Please Input"
>
{{model.displayFactory}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Display Factor Sym</td>
<td>
<a
href="#"
data-name="displayFactorSym"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter display factor symbol"
data-emptytext="Please Input"
>
{{model.displayFactorSym}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Sub Category</td>
<td>
<a
href="#"
data-name="subCategory"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter sub category"
data-emptytext="Please Input"
>
{{model.subCategory}}
</a>
</td>
</tr>
<tr class="result">
<td>Metric Source</td>
<td>
<a
href="#"
data-name="source"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter source"
data-emptytext="Please Input"
data-value={{model.source}}
>
</a>
</td>
</tr>
<tr class="result">
<td>Metric Source Type</td>
<td>
<a
href="#"
data-name="sourceType"
data-pk="{{model.id}}"
class="xeditable"
data-type="text"
data-placement="right"
data-title="Enter source type"
data-emptytext="Please Input"
data-value={{model.sourceType}}
>
</a>
</td>
</tr>
</tbody>
</table>
</div>
</script>
<script type="text/x-handlebars" id="dataset">
<div id="dataset" >
<div class="well well-sm">
@ -815,6 +815,347 @@
</div><!-- /.modal -->
</script>
<script type="text/x-handlebars" id="components/dataset-relations">
{{#if hasDepends}}
<table id="depends-table" class="columntreegrid tree table table-bordered dataset-detail-table">
<thead>
<tr class="results-header">
<th class="span2">Depends on</th>
<th class="span1">Object Type</th>
<th class="span1">Object Sub Type</th>
<th class="span2">Level</th>
</tr>
</thead>
<tbody>
{{#each depends as |depend|}}
<tr class="{{depend.treeGridClass}}">
<td>
{{#if depend.isValidDataset}}
<a href={{depend.datasetLink}}>
{{depend.objectName}}
</a>
{{else}}
{{depend.objectName}}
{{/if}}
</td>
<td>{{depend.objectType}}</td>
<td>{{depend.objectSubType}}</td>
<td>{{depend.level}}</td>
</tr>
{{/each}}
</tbody>
</table>
{{/if}}
{{#if hasReferences}}
<table id="references-table" class="columntreegrid tree table table-bordered dataset-detail-table">
<thead>
<tr class="results-header">
<th class="span2">Referred By</th>
<th class="span1">Object Type</th>
<th class="span1">Object Sub Type</th>
<th class="span2">Level</th>
</tr>
</thead>
<tbody>
{{#each references as |reference|}}
<tr class="{{reference.treeGridClass}}">
<td>
{{#if reference.isValidDataset}}
<a href={{reference.datasetLink}}>
{{reference.objectName}}
</a>
{{else}}
{{reference.objectName}}
{{/if}}
</td>
<td>{{reference.objectType}}</td>
<td>{{reference.objectSubType}}</td>
<td>{{reference.level}}</td>
</tr>
{{/each}}
</tbody>
</table>
{{/if}}
</script>
<script type="text/x-handlebars" id="components/schema-comment">
<i
class="fa fa-pencil pull-right wh-clickable-icon"
@ -1,41 +1,15 @@
apply plugin: 'java'
apply plugin: 'scala'
apply plugin: 'idea'
def findPlayHome(){
project.ext.playHome = System.getenv()['PLAY_HOME']
project.ext.playHome = System.getenv()['ACTIVATOR_HOME']
if (null == project.ext.playHome) {
throw new GradleException('PLAY_HOME env variable not set!')
throw new GradleException('ACTIVATOR_HOME env variable not set!')
}
project.ext.playExec = "${playHome}/play"
project.ext.playExec = "${playHome}/bin/activator"
}
findPlayHome()
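The resolution that `findPlayHome()` performs can be sketched as shell. The fallback path below is an assumption matching the Activator install used in this PR's Travis setup; the Gradle build itself fails hard when `ACTIVATOR_HOME` is unset instead of defaulting:

```shell
# Sketch of the path resolution findPlayHome() performs in the Gradle build.
# The default is illustrative only (mirrors the Travis setup in this PR).
ACTIVATOR_HOME="${ACTIVATOR_HOME:-$HOME/activator-1.3.11-minimal}"
PLAY_EXEC="$ACTIVATOR_HOME/bin/activator"
echo "$PLAY_EXEC"
```

Gradle then shells out to this launcher for `compile`, `stage`, and `dist`, replacing the old `$PLAY_HOME/play` executable.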
repositories{
mavenCentral()
// Play framework manages its own dependencies in a local Ivy repo
ivy{
def repoDir = "${playHome}/repository/local/"
url repoDir
ivyPattern "${repoDir}/[organisation]/[module]/[revision]/[type]s/[artifact].[ext]"
artifactPattern "${repoDir}/[organisation]/[module]/[revision]/[type]s/[artifact].[ext]"
layout 'pattern'
}
jcenter()
maven {
name "typesafe-maven-release"
url "https://repo.typesafe.com/typesafe/maven-releases"
}
ivy {
name "typesafe-ivy-release"
url "https://repo.typesafe.com/typesafe/ivy-releases"
layout "ivy"
}
}
configurations{
//Configuration containing sbt generated .class files
//This is needed for IDEs, because they cannot compile
@ -49,12 +23,11 @@ configurations{
dependencies{
// User defined libraries (will be copied to lib/ before `play compile`)
// compile 'group:name:0.1'
compile externalDependency.play
compile externalDependency.play_java_jdbc
compile externalDependency.play_ebean
compile externalDependency.play_java_ws
compile externalDependency.play_cache
compile externalDependency.spring_context
compile externalDependency.play_filter
compile externalDependency.spring_jdbc
compile externalDependency.mockito
@ -105,7 +78,7 @@ task "build" (type: Exec, dependsOn: playClean, overwrite: true) {
}
task "assemble" (type: Exec, dependsOn: playClean, overwrite: true) {
commandLine playExec, 'stage'
}
task "dist" (type: Exec, overwrite: true) {
@ -115,12 +88,3 @@ task "dist" (type: Exec, overwrite: true) {
task "check" (overwrite: true) {
// skip gradle check of this repository
}
/*
// optional: if using 'eclipse' plugin
eclipse {
classpath {
plusConfigurations += configurations.playManaged
plusConfigurations += configurations.provided
}
}
*/
@ -2,16 +2,16 @@ name := "wherehows"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava, SbtTwirl)
scalaVersion := "2.10.6"
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
javaWs,
cache,
"mysql" % "mysql-connector-java" % "5.1.22",
"org.springframework" % "spring-context" % "4.1.1.RELEASE",
"org.springframework" % "spring-jdbc" % "4.1.1.RELEASE",
"org.mockito" % "mockito-core" % "1.9.5"
filters,
"mysql" % "mysql-connector-java" % "5.1.40",
"org.springframework" % "spring-jdbc" % "4.1.6.RELEASE",
"org.mockito" % "mockito-core" % "1.10.19"
)
libraryDependencies += filters
play.Project.playJavaSettings
@ -4,18 +4,15 @@
# Secret key
# ~~~~~
# The secret key is used to secure cryptographic functions.
# If you deploy your application to several instances be sure to use the same key!
application.secret="`h2af<Nx<TaSfF82tv8Woeo52_l72iODF>G72cnrHDNmlRRpqyD_YgYb<aCea^9o"
#
# This must be changed for production, but we recommend not changing it in this file.
#
# See http://www.playframework.com/documentation/latest/ApplicationSecret for more details.
play.crypto.secret = "changeme"
# The application languages
# ~~~~~
application.langs="en"
# Global object class
# ~~~~~
# Define the Global object class for this application.
# Default to Global in the root package.
# application.global=Global
play.i18n.langs = [ "en" ]
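The renames in this hunk are the standard Play 2.2 → 2.4 configuration key migration, summarized here for reference (key names come from the Play 2.4 migration guide, not from this diff):

```
# Play 2.2 key                      Play 2.4 replacement
application.secret            ->    play.crypto.secret
application.langs="en"        ->    play.i18n.langs = [ "en" ]   (now a list)
evolutionplugin=disabled      ->    play.evolutions.enabled = false
logger.root / logger.play /
logger.application            ->    moved to conf/logback.xml
```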
# Router
# ~~~~~
@ -23,10 +20,10 @@ application.langs="en"
# This router will be looked up first when the application is starting up,
# so make sure this is the entry point.
# Furthermore, it's assumed your route file is named properly.
# So for an application router like `conf/my.application.Router`,
# you may need to define a router file `my.application.routes`.
# Default to Routes in the root package (and `conf/routes`)
# application.router=my.application.Routes
# So for an application router like `my.application.Router`,
# you may need to define a router file `conf/my.application.routes`.
# Default to Routes in the root package (and conf/routes)
# play.http.router = my.application.Routes
# Database configuration
# ~~~~~
@ -35,37 +32,16 @@ application.langs="en"
#
# db.default.driver=org.h2.Driver
# db.default.url="jdbc:h2:mem:play"
# db.default.user=sa
# db.default.username=sa
# db.default.password=""
#
# You can expose this datasource via JNDI if needed (Useful for JPA)
# db.default.jndiName=DefaultDS
# Evolutions
# ~~~~~
# You can disable evolutions if needed
# evolutionplugin=disabled
# play.evolutions.enabled=false
# Ebean configuration
# ~~~~~
# You can declare as many Ebean servers as you want.
# By convention, the default server is named `default`
#
# ebean.default="models.*"
# Logger
# ~~~~~
# You can also configure logback (http://logback.qos.ch/),
# by providing an application-logger.xml file in the conf directory.
# Root logger:
logger.root=ERROR
# Logger used by the framework:
logger.play=INFO
# Logger provided to your application:
logger.application=DEBUG
# You can disable evolutions for a specific datasource if necessary
# play.evolutions.db.default.enabled=false
datasets.tree.name = "/var/tmp/wherehows/resource/dataset.json"
flows.tree.name = "/var/tmp/wherehows/resource/flow.json"
@ -1,15 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-4.1.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-4.1.xsd">
<context:component-scan base-package="controllers,security"/>
<bean name="AuthenticatedAction" class="play.mvc.Security.AuthenticatedAction" autowire-candidate="true"></bean>
</beans>
web/conf/logback.xml Normal file
@ -0,0 +1,37 @@
<!--
Copyright 2015 LinkedIn Corp. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-->
<configuration>
<conversionRule conversionWord="coloredLevel" converterClass="play.api.Logger$ColoredLevel" />
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%coloredLevel - %logger - %message%n%xException</pattern>
</encoder>
</appender>
<!--
The logger name is typically the Java/Scala package name.
This configures the log level to log at for a package and its children packages.
-->
<logger name="play" level="INFO" />
<logger name="application" level="DEBUG" />
<root level="ERROR">
<appender-ref ref="STDOUT" />
</root>
</configuration>
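Per-package log levels now live in this file rather than in application.conf; further loggers can be added alongside the two above, for example (package name is illustrative only):

```xml
<!-- illustrative addition: raise one package's level without changing root -->
<logger name="metadata.etl" level="DEBUG" />
```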
@ -1 +1 @@
sbt.version=0.13.5
sbt.version=0.13.12
@ -5,4 +5,5 @@ logLevel := Level.Warn
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.4")
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.8")
addSbtPlugin("com.typesafe.sbt" % "sbt-twirl" % "1.1.1")
@ -1,6 +1,3 @@
apply plugin: 'java'
version = '1.0'
dependencies {
compile externalDependency.slf4j_api
compile externalDependency.slf4j_log4j