Add base code

commit 3167eb4b05
58 changed files with 9485 additions and 0 deletions
.gitignore (vendored, new file, +4)
@@ -0,0 +1,4 @@
.idea/
tools
*.patch
*.diff
CONTRIBUTING.md (new file, +4)
@@ -0,0 +1,4 @@
Contributing to Unikraft
========================

Please refer to the `CONTRIBUTING.md` file in the main Unikraft repository.
COPYING.md (new file, +38)
@@ -0,0 +1,38 @@
License
=======

Unikraft Tools
------------------------

This repository contains tools related to the Unikraft project. The code
is published as a mixture of BSD and MIT licences; each Go code file in
this repository should declare who is the copyright owner and under which terms
and conditions the code is licensed. If such a licence note is missing, the
following copyright notice will apply:

Copyright (c) 2019, Université de Liège (ULiège). All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
MAINTAINERS.md (new file, +9)
@@ -0,0 +1,9 @@
Maintainers List
================

For notes on how to read this information, please refer to `MAINTAINERS.md` in
the main Unikraft repository.

TOOLS-UNIKRAFT
M: Gaulthier Gain <gaulthier.gain@uliege.be>
F: *
README.md (new file, +39)
@@ -0,0 +1,39 @@
Unikraft Tools
==============

Unikraft is an automated system for building specialized OSes and
unikernels tailored to the needs of specific applications. It is based
around the concept of small, modular libraries, each providing a part
of the functionality commonly found in an operating system (e.g.,
memory allocation, scheduling, filesystem support, network stack,
etc.).

This repo contains all tools related to Unikraft, and in particular
`main.go`, which acts as a single point of entry for all Unikraft
operations, including the downloading, building and running
of Unikraft applications (a minimal illustrative sketch of such an
entry point is shown below).
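A minimal, hypothetical sketch of what such an entry point can look like; the flag names and dispatch logic below are illustrative assumptions, not the actual `main.go` added by this commit:

```go
package main

import (
	"fmt"
	"os"
)

// Illustrative only: the real main.go may use different flags and helpers.
func main() {
	if len(os.Args) < 2 {
		fmt.Println("usage: tools --dep|--build [options]")
		os.Exit(1)
	}
	// Dispatch to the requested tool; only two tools exist for now.
	switch os.Args[1] {
	case "--dep":
		// run the dependency analysis tool
	case "--build":
		// run the automatic build tool
	default:
		fmt.Println("unknown tool:", os.Args[1])
		os.Exit(1)
	}
}
```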
Note that this repo is currently under heavy development
and should not yet be used unless you know what you are doing. As things
stabilize, we will update this file to reflect that.

# Toolchain

Welcome to the Unikraft tools wiki!

The Unikraft tools are a set of tools that automatically build images of operating systems tailored to specific applications. The toolchain will include the following tools:
1. **Decomposition tool** to assist developers in breaking existing monolithic software into smaller components.
2. **Dependency analysis tool** to analyse existing, unmodified applications to determine which set of libraries and OS primitives are absolutely necessary for correct execution.
3. **Automatic build tool** to match the requirements derived by the dependency analysis tool to the available libraries constructed by the OS decomposition tools. It is composed of two components: a static analysis and a dynamic analysis.
4. **Verification tool** to ensure that the functionality of the resulting, specialized OS+application matches that of the application running on a standard OS. The tool will also take care of ensuring software quality.
5. **Performance optimization tool** to analyse the running specialized OS+application and to use this information as input to the automatic build tools so that they can generate even more optimized images.

For now, only the **Dependency analysis tool** and the **Automatic build tool** are available.

## Installation and documentation

For installation and documentation, a wiki is available at this [address](https://github.com/gaulthiergain/tools/wiki).

## Contribute

The Unikraft tools are an open-source project (under the MIT license), currently hosted at https://github.com/gaulthiergain/tools. You are encouraged to download the code, examine it, modify it, and submit bug reports, bug fixes, feature requests, new features and other issues and pull requests.
configfiles/nginx.txt (new file, +6)
@@ -0,0 +1,6 @@
nginx -h
nginx -v
nginx -V
nginx -t
nginx -T
nginx -Tq
srcs/Makefile (new file, +42)
@@ -0,0 +1,42 @@
# Program arguments
BINARY_NAME ?= tools
BINARY_UNIX ?= $(BINARY_NAME)_unix
CONTAINER_NAME ?= unikraft/tools:latest

## Tools
DOCKER ?= docker
TARGET ?= binary
GO ?= go
GOBUILD ?= $(GO) build
GOCLEAN ?= $(GO) clean
GOTEST ?= $(GO) test
GOGET ?= $(GO) get

# Targets
all: build
container:
	$(DOCKER) build \
		-t $(CONTAINER_NAME) \
		-f Dockerfile \
		--target=$(TARGET) \
		.
build:
	$(GOBUILD) -o $(BINARY_NAME) -v
test:
	$(GOTEST) -v ./...
clean:
	$(GOCLEAN)
	rm -f $(BINARY_NAME)
	rm -f $(BINARY_UNIX)
run:
	$(GOBUILD) -o $(BINARY_NAME) -v
	./$(BINARY_NAME)
deps:
	$(GOGET) github.com/fatih/color
	$(GOGET) github.com/akamensky/argparse
	$(GOGET) github.com/awalterschulze/gographviz
	$(GOGET) github.com/sergi/go-diff/...
	$(GOGET) github.com/AlecAivazis/survey
# Cross compilation
build-linux:
	CGO_ENABLED=0 GOOS=linux GOARCH=amd64 $(GOBUILD) -o $(BINARY_UNIX) -v
srcs/buildtool/args.go (new file, +40)
@@ -0,0 +1,40 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file
//
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>

package buildtool

import (
	"github.com/akamensky/argparse"
	"os"
	u "tools/srcs/common"
)

const (
	programArg  = "program"
	unikraftArg = "unikraft"
	sourcesArg  = "sources"
	makefileArg = "makefile"
)

// parseLocalArguments parses the local arguments of the build tool.
//
// It returns an error if any, otherwise it returns nil.
func parseLocalArguments(p *argparse.Parser, args *u.Arguments) error {

	args.InitArgParse(p, args, u.STRING, "p", programArg,
		&argparse.Options{Required: true, Help: "Program name"})

	args.InitArgParse(p, args, u.STRING, "u", unikraftArg,
		&argparse.Options{Required: false, Help: "Unikraft Path"})
	args.InitArgParse(p, args, u.STRING, "s", sourcesArg,
		&argparse.Options{Required: true, Help: "App Sources " +
			"Folder"})
	args.InitArgParse(p, args, u.STRING, "m", makefileArg,
		&argparse.Options{Required: false, Help: "Add additional properties " +
			"for Makefile"})

	return u.ParserWrapper(p, os.Args)
}
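For context on the `argparse` calls above, here is a small, self-contained sketch of how the github.com/akamensky/argparse package is typically driven. It is an illustrative standalone example, not code from this commit:

```go
package main

import (
	"fmt"
	"os"

	"github.com/akamensky/argparse"
)

func main() {
	// Create a parser and register a required string option, mirroring the
	// "-p/--program" flag defined in parseLocalArguments above.
	parser := argparse.NewParser("tools", "Unikraft build tool (sketch)")
	program := parser.String("p", "program",
		&argparse.Options{Required: true, Help: "Program name"})

	if err := parser.Parse(os.Args); err != nil {
		fmt.Println(parser.Usage(err))
		os.Exit(1)
	}
	fmt.Println("building program:", *program)
}
```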
srcs/buildtool/kconfig_parser.go (new file, +470)
@@ -0,0 +1,470 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"io"
|
||||||
|
"os"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
configLine = iota // Config <CONFIG_.*> = <value>
|
||||||
|
commentedConfigLine // Commented config: # <CONFIG_.*> is not set
|
||||||
|
headerLine // Header: # <.*>
|
||||||
|
separatorLine // Separator: #
|
||||||
|
lineFeed // Line FEED: \n
|
||||||
|
)
|
||||||
|
|
||||||
|
// KConfig is an exported struct that represents a single Kconfig entry.
|
||||||
|
type KConfig struct {
|
||||||
|
Config string
|
||||||
|
Value *string
|
||||||
|
Type int
|
||||||
|
}
|
||||||
|
|
||||||
|
// writeConfig writes a '.config' file for the Unikraft build system.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func writeConfig(filename string, items []*KConfig) error {
|
||||||
|
|
||||||
|
f, err := os.Create(filename)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
defer f.Close()
|
||||||
|
|
||||||
|
for _, kConfig := range items {
|
||||||
|
|
||||||
|
var config string
|
||||||
|
switch kConfig.Type {
|
||||||
|
case configLine:
|
||||||
|
config = kConfig.Config + "=" + *kConfig.Value
|
||||||
|
case commentedConfigLine:
|
||||||
|
config = "# " + kConfig.Config + " is not set"
|
||||||
|
case headerLine:
|
||||||
|
config = kConfig.Config
|
||||||
|
case separatorLine:
|
||||||
|
config = "#"
|
||||||
|
case lineFeed:
|
||||||
|
config = "\n"
|
||||||
|
}
|
||||||
|
if _, err := f.Write([]byte(config + "\n")); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
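To make the line types concrete, here is a standalone sketch (illustrative only, re-declaring a minimal copy of the types rather than importing the package) showing how each `KConfig` type serializes under the same rules as `writeConfig`:

```go
package main

import "fmt"

// Mirror of the line types used by the parser above.
const (
	configLine = iota
	commentedConfigLine
	headerLine
	separatorLine
	lineFeed
)

type KConfig struct {
	Config string
	Value  *string
	Type   int
}

// render applies the same formatting rules as writeConfig.
func render(k KConfig) string {
	switch k.Type {
	case configLine:
		return k.Config + "=" + *k.Value
	case commentedConfigLine:
		return "# " + k.Config + " is not set"
	case headerLine:
		return k.Config
	case separatorLine:
		return "#"
	default: // lineFeed
		return "\n"
	}
}

func main() {
	y := "y"
	fmt.Println(render(KConfig{"CONFIG_LIBUKALLOC", &y, configLine}))          // CONFIG_LIBUKALLOC=y
	fmt.Println(render(KConfig{"CONFIG_LWIP_IPV6", nil, commentedConfigLine})) // # CONFIG_LWIP_IPV6 is not set
}
```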
|
||||||
|
|
||||||
|
// parseConfig parses a '.config' file used by the Unikraft build system.
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig and an error if any, otherwise it returns nil.
|
||||||
|
func parseConfig(filename string, kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig, matchedLibs []string) ([]*KConfig, error) {
|
||||||
|
|
||||||
|
f, err := os.Open(filename)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer f.Close()
|
||||||
|
|
||||||
|
r := bufio.NewReader(f)
|
||||||
|
for {
|
||||||
|
line, err := r.ReadString(0x0A)
|
||||||
|
|
||||||
|
items = addKConfig(line, kConfigMap, items, matchedLibs)
|
||||||
|
|
||||||
|
if err == io.EOF {
|
||||||
|
break
|
||||||
|
} else if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return items, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// addKConfig adds a KConfig entry to adequate data structures.
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig. This list will be saved into a '.config'
|
||||||
|
// file.
|
||||||
|
func addKConfig(line string, kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig, matchedLibs []string) []*KConfig {
|
||||||
|
|
||||||
|
var config string
|
||||||
|
var value *string
|
||||||
|
var typeConfig int
|
||||||
|
|
||||||
|
switch {
|
||||||
|
case strings.HasPrefix(line, "#") && strings.Contains(line,
|
||||||
|
"CONFIG"): // Commented config: # <CONFIG_.*> is not set
|
||||||
|
|
||||||
|
split := strings.Fields(line)
|
||||||
|
config = split[1]
|
||||||
|
value = nil
|
||||||
|
typeConfig = commentedConfigLine
|
||||||
|
case strings.HasPrefix(line, "#") && len(line) > 2: // Header: # <.*>
|
||||||
|
config = strings.TrimSuffix(line, "\n")
|
||||||
|
value = nil
|
||||||
|
typeConfig = headerLine
|
||||||
|
case strings.HasPrefix(line, "#") && len(line) == 2: // Separator: #
|
||||||
|
config, value = "#", nil
|
||||||
|
typeConfig = separatorLine
|
||||||
|
case strings.Contains(line, "="): // Config: <CONFIG_.*> = y
|
||||||
|
split := strings.Split(line, "=")
|
||||||
|
config = split[0]
|
||||||
|
word := strings.TrimSuffix(split[1], "\n")
|
||||||
|
value = &word
|
||||||
|
typeConfig = configLine
|
||||||
|
default: // Line FEED
|
||||||
|
config, value = "#", nil
|
||||||
|
typeConfig = lineFeed
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create KConfig
|
||||||
|
kConfig := &KConfig{
|
||||||
|
config,
|
||||||
|
value,
|
||||||
|
typeConfig,
|
||||||
|
}
|
||||||
|
|
||||||
|
// If config is not a comment, perform additional procedures
|
||||||
|
if config != "#" {
|
||||||
|
kConfigMap[config] = kConfig
|
||||||
|
items = append(items, kConfigMap[config])
|
||||||
|
items = addInternalConfig(config, kConfigMap, items)
|
||||||
|
items = matchLibsKconfig(config, kConfigMap, items, matchedLibs)
|
||||||
|
} else {
|
||||||
|
items = append(items, kConfig)
|
||||||
|
}
|
||||||
|
|
||||||
|
return items
|
||||||
|
}
|
||||||
|
|
||||||
|
// updateConfig updates KConfig entries to particular values.
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig.
|
||||||
|
func updateConfig(kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig) []*KConfig {
|
||||||
|
v := "y"
|
||||||
|
var configs = []*KConfig{
|
||||||
|
// CONFIG libs
|
||||||
|
{"CONFIG_HAVE_BOOTENTRY", &v, configLine},
|
||||||
|
{"CONFIG_HAVE_SCHED", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKARGPARSE", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKBUS", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKSGLIST", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKTIMECONV", &v, configLine},
|
||||||
|
|
||||||
|
// CONFIG build
|
||||||
|
{"CONFIG_OPTIMIZE_NONE", &v, configLine},
|
||||||
|
{"CONFIG_OPTIMIZE_PERF", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
|
||||||
|
return SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SetConfig updates a specific KConfig entry.
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig.
|
||||||
|
func SetConfig(newConfigs []*KConfig, kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig) []*KConfig {
|
||||||
|
|
||||||
|
for _, conf := range newConfigs {
|
||||||
|
// If kConfigMap does not contain the value, add it
|
||||||
|
if _, ok := kConfigMap[conf.Config]; !ok {
|
||||||
|
if len(conf.Config) > 1 {
|
||||||
|
kConfigMap[conf.Config] = conf
|
||||||
|
items = append(items, kConfigMap[conf.Config])
|
||||||
|
} else {
|
||||||
|
items = append(items, conf)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Update only
|
||||||
|
newConfiguration := kConfigMap[conf.Config]
|
||||||
|
newConfiguration.Value = conf.Value
|
||||||
|
newConfiguration.Type = conf.Type
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return items
|
||||||
|
}
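As a usage note, `SetConfig` either appends a new entry or updates an existing one in place, so callers can keep passing the same map/slice pair. A hypothetical in-package snippet (not part of this commit) illustrating that behaviour:

```go
// Hypothetical snippet inside package buildtool (illustrative only).
func exampleSetConfig() []*KConfig {
	v := "y"
	kConfigMap := make(map[string]*KConfig)
	items := make([]*KConfig, 0)

	// First call inserts both entries.
	items = SetConfig([]*KConfig{
		{"CONFIG_LIBVFSCORE", &v, configLine},
		{"CONFIG_LIBRAMFS", nil, commentedConfigLine},
	}, kConfigMap, items)

	// A second call with an already-known Config updates it in place
	// instead of appending a duplicate entry.
	items = SetConfig([]*KConfig{
		{"CONFIG_LIBRAMFS", &v, configLine},
	}, kConfigMap, items)

	return items
}
```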
|
||||||
|
|
||||||
|
// matchLibsKconfig performs the matching between Kconfig entries and micro-libs
|
||||||
|
// and updates the right Kconfig
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig.
|
||||||
|
func matchLibsKconfig(conf string, kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig, matchedLibs []string) []*KConfig {
|
||||||
|
|
||||||
|
v := "y"
|
||||||
|
switch conf {
|
||||||
|
case "CONFIG_LIBPOSIX_PROCESS":
|
||||||
|
if u.Contains(matchedLibs, POSIXPROCESS) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBPOSIX_PROCESS", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBPOSIX_USER":
|
||||||
|
if u.Contains(matchedLibs, POSIXUSER) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBPOSIX_USER", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBSYSCALL_SHIM":
|
||||||
|
if u.Contains(matchedLibs, SYSCALLSHIM) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBSYSCALL_SHIM", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBUKTIME":
|
||||||
|
if u.Contains(matchedLibs, UKTIME) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKTIME", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_UKSYSINFO":
|
||||||
|
if u.Contains(matchedLibs, UKSYSINFO) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_UKSYSINFO", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_POSIX_LIBDL":
|
||||||
|
if u.Contains(matchedLibs, POSIXLIBDL) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_POSIX_LIBDL", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBVFSCORE":
|
||||||
|
if u.Contains(matchedLibs, VFSCORE) {
|
||||||
|
n := "16"
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBVFSCORE", &v, configLine},
|
||||||
|
{"CONFIG_LIBRAMFS", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBDEVFS", &v, configLine},
|
||||||
|
{"CONFIG_LIBDEVFS_USE_RAMFS", nil, commentedConfigLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"# vfscore configuration", nil, headerLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"CONFIG_LIBVFSCORE_PIPE_SIZE_ORDER", &n, configLine},
|
||||||
|
{"CONFIG_LIBVFSCORE_AUTOMOUNT_ROOTFS", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBNEWLIBC":
|
||||||
|
if u.Contains(matchedLibs, NEWLIB) {
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_HAVE_LIBC", &v, configLine},
|
||||||
|
{"CONFIG_LIBNEWLIBC", &v, configLine},
|
||||||
|
{"CONFIG_LIBNEWLIBM", &v, configLine},
|
||||||
|
{"CONFIG_LIBNEWLIBC_WANT_IO_C99_FORMATS", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBNEWLIBC_LINUX_ERRNO_EXTENSIONS", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBPTHREAD_EMBEDDED":
|
||||||
|
if u.Contains(matchedLibs, PTHREADEMBEDDED) {
|
||||||
|
number := "32"
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBPTHREAD_EMBEDDED", &v, configLine},
|
||||||
|
{"CONFIG_LIBPTHREAD_EMBEDDED_MAX_SIMUL_THREADS", &number, configLine},
|
||||||
|
{"CONFIG_LIBPTHREAD_EMBEDDED_MAX_TLS", &number, configLine},
|
||||||
|
{"CONFIG_LIBPTHREAD_EMBEDDED_UTEST", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
case "CONFIG_LIBLWIP":
|
||||||
|
if u.Contains(matchedLibs, LWIP) {
|
||||||
|
seed, queues := "23", "1"
|
||||||
|
mss, dnsMaxServer, dnsTableSize := "1460", "2", "32"
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_VIRTIO_NET", &v, configLine},
|
||||||
|
//
|
||||||
|
{"CONFIG_LIBUKMPI", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKMPI_MBOX", &v, configLine},
|
||||||
|
//
|
||||||
|
{"CONFIG_LIBUKSWRAND", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKSWRAND_MWC", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKSWRAND_INITIALSEED", &seed, configLine},
|
||||||
|
//
|
||||||
|
{"CONFIG_LIBUKNETDEV", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKNETDEV_MAXNBQUEUES", &queues, configLine},
|
||||||
|
{"CONFIG_LIBUKNETDEV_DISPATCHERTHREADS", &v, configLine},
|
||||||
|
//
|
||||||
|
{"CONFIG_LIBLWIP", &v, configLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"# Netif drivers", nil, headerLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"CONFIG_LWIP_UKNETDEV", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_AUTOIFACE", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_NOTHREADS", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_THREADS", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_HEAP", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_NETIF_EXT_STATUS_CALLBACK", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_NETIF_STATUS_PRINT", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_IPV4", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_IPV6", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_UDP", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_TCP", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_TCP_MSS", &mss, configLine},
|
||||||
|
{"CONFIG_LWIP_WND_SCALE", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_TCP_KEEPALIVE", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_TCP_TIMESTAMPS", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_ICMP", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_IGMP", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_SNMP", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_DHCP", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LWIP_DNS", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_DNS_MAX_SERVERS", &dnsMaxServer, configLine},
|
||||||
|
{"CONFIG_LWIP_DNS_TABLE_SIZE", &dnsTableSize, configLine},
|
||||||
|
{"CONFIG_LWIP_SOCKET", &v, configLine},
|
||||||
|
{"CONFIG_LWIP_DEBUG", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return items
|
||||||
|
}
|
||||||
|
|
||||||
|
// addInternalConfig adds the Kconfig entries of internal micro-libs that a
|
||||||
|
// given config depends on, and updates the right Kconfigs.
|
||||||
|
//
|
||||||
|
// It returns a list of KConfig.
|
||||||
|
func addInternalConfig(conf string, kConfigMap map[string]*KConfig,
|
||||||
|
items []*KConfig) []*KConfig {
|
||||||
|
v := "y"
|
||||||
|
switch conf {
|
||||||
|
case "CONFIG_PLAT_XEN":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_PLAT_XEN", &v, configLine},
|
||||||
|
{"CONFIG_XEN_HVMLITE", nil, commentedConfigLine},
|
||||||
|
{"", nil, lineFeed},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"# Console Options", nil, headerLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"CONFIG_XEN_KERNEL_HV_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_XEN_KERNEL_EMG_CONSOLE", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_XEN_DEBUG_HV_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_XEN_DEBUG_EMG_CONSOLE", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_XEN_PV_BUILD_P2M", &v, configLine},
|
||||||
|
{"CONFIG_XEN_GNTTAB", &v, configLine},
|
||||||
|
{"CONFIG_XEN_XENBUS", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_PLAT_KVM":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_PLAT_KVM", &v, configLine},
|
||||||
|
{"", nil, lineFeed},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"# Console Options", nil, headerLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"CONFIG_KVM_KERNEL_SERIAL_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_KVM_KERNEL_VGA_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_KVM_DEBUG_SERIAL_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_KVM_DEBUG_VGA_CONSOLE", &v, configLine},
|
||||||
|
{"CONFIG_KVM_PCI", &v, configLine},
|
||||||
|
{"CONFIG_VIRTIO_BUS", &v, configLine},
|
||||||
|
{"", nil, lineFeed},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"# Virtio", nil, headerLine},
|
||||||
|
{"#", nil, separatorLine},
|
||||||
|
{"CONFIG_VIRTIO_PCI", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_VIRTIO_NET", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_PLAT_LINUXU":
|
||||||
|
heapSize := "4"
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_PLAT_LINUXU", &v, configLine},
|
||||||
|
{"CONFIG_LINUXU_DEFAULT_HEAPMB", &heapSize, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKBOOT":
|
||||||
|
var number = "60"
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKBOOT", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKBOOT_BANNER", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKBOOT_MAXNBARGS", &number, configLine},
|
||||||
|
{"CONFIG_LIBUKBOOT_INITALLOC", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTK", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTK_INFO", &v, configLine},
|
||||||
|
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTK_WARN", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTK_ERR", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTK_CRIT", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINTD", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_NOREDIR", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_REDIR_PRINTD", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_REDIR_PRINTK", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINT_TIME", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_PRINT_STACK", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_ENABLE_ASSERT", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKDEBUG_TRACEPOINTS", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBNOLIBC":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBNOLIBC", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBNOLIBC_UKDEBUG_ASSERT", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKALLOC":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKALLOC", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKALLOC_IFPAGES", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKALLOC_IFSTATS", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKALLOCBBUDDY", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKSCHED":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKSCHED", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKSCHEDCOOP", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKMPI":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKMPI", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKMPI_MBOX", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKSWRAND":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKSWRAND_MWC", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKSWRAND_INITIALSEED", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_DEV_RANDOM", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKNETDEV":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKNETDEV_MAXNBQUEUES", nil, commentedConfigLine},
|
||||||
|
{"CONFIG_LIBUKNETDEV_DISPATCHERTHREADS", nil, commentedConfigLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
case "CONFIG_LIBUKLOCK":
|
||||||
|
configs := []*KConfig{
|
||||||
|
{"CONFIG_LIBUKLOCK", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKLOCK_SEMAPHORE", &v, configLine},
|
||||||
|
{"CONFIG_LIBUKLOCK_MUTEX", &v, configLine},
|
||||||
|
}
|
||||||
|
items = SetConfig(configs, kConfigMap, items)
|
||||||
|
}
|
||||||
|
|
||||||
|
return items
|
||||||
|
}
|
srcs/buildtool/makefile_process.go (new file, +161)
@@ -0,0 +1,161 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"path/filepath"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ----------------------------Generate Makefile--------------------------------
|
||||||
|
|
||||||
|
// generateMakefile generates a 'Makefile' file for the Unikraft build system.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func generateMakefile(filename, unikraftPath, appFolder string,
|
||||||
|
matchedLibs []string, externalLibs map[string]string) error {
|
||||||
|
|
||||||
|
var sb strings.Builder
|
||||||
|
|
||||||
|
// Set unikraft root and libs workspace
|
||||||
|
sb.WriteString("UK_ROOT ?= " + unikraftPath + "unikraft\n" +
|
||||||
|
"UK_LIBS ?= " + unikraftPath + "libs\n")
|
||||||
|
|
||||||
|
var libC = ""
|
||||||
|
// Add external libs
|
||||||
|
sb.WriteString("LIBS := ")
|
||||||
|
if len(matchedLibs) > 0 {
|
||||||
|
for _, lib := range matchedLibs {
|
||||||
|
// Only write external libs
|
||||||
|
if _, ok := externalLibs[lib]; ok {
|
||||||
|
if strings.Compare(NEWLIB, lib) == 0 ||
|
||||||
|
strings.Compare(MUSL, lib) == 0 {
|
||||||
|
libC = lib
|
||||||
|
} else {
|
||||||
|
sb.WriteString("$(UK_LIBS)/" + lib + ":")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Write libC at the end to avoid conflicts
|
||||||
|
if len(libC) > 0 {
|
||||||
|
sb.WriteString("$(UK_LIBS)/" + libC)
|
||||||
|
}
|
||||||
|
|
||||||
|
sb.WriteString("\n\n")
|
||||||
|
|
||||||
|
// Bind UK_ROOT to make
|
||||||
|
sb.WriteString("all:\n" +
|
||||||
|
"\t@make -C $(UK_ROOT) A=" + appFolder + " L=$(LIBS)\n\n" +
|
||||||
|
"$(MAKECMDGOALS):\n" +
|
||||||
|
"\t@make -C $(UK_ROOT) A=" + appFolder + " L=$(LIBS) $(MAKECMDGOALS)\n")
|
||||||
|
|
||||||
|
// Save the content to Makefile
|
||||||
|
return u.WriteToFile(filename, []byte(sb.String()))
|
||||||
|
}
|
||||||
|
|
||||||
|
// typeFile determines the type of a given file.
|
||||||
|
//
|
||||||
|
// It returns a string that represents the used language.
|
||||||
|
func typeFile(filename string) string {
|
||||||
|
var extension = filepath.Ext(filename)
|
||||||
|
var flag string
|
||||||
|
switch extension {
|
||||||
|
case ".c":
|
||||||
|
flag = "C"
|
||||||
|
case ".cc":
|
||||||
|
case ".cpp":
|
||||||
|
flag = "CXX"
|
||||||
|
case ".S":
|
||||||
|
case ".asm":
|
||||||
|
flag = "AS"
|
||||||
|
}
|
||||||
|
return flag
|
||||||
|
}
|
||||||
|
|
||||||
|
// generateMakefileUK generates a 'Makefile.uk' file for the Unikraft build
|
||||||
|
// system.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func generateMakefileUK(filename, programName, filetype string,
|
||||||
|
makefileLines string, sourceFiles []string) error {
|
||||||
|
|
||||||
|
var sb strings.Builder
|
||||||
|
|
||||||
|
// Add app registration
|
||||||
|
sb.WriteString("########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"# App registration\n" +
|
||||||
|
"########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"$(eval $(call addlib,app" + strings.ToLower(programName) + "))\n\n")
|
||||||
|
|
||||||
|
// Add app includes (headers)
|
||||||
|
sb.WriteString("########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"# App includes\n" +
|
||||||
|
"########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"CINCLUDES-y += -I$(APP" + strings.ToUpper(programName) + "_BASE)" +
|
||||||
|
"/include\n\n")
|
||||||
|
|
||||||
|
// Add app global flags
|
||||||
|
sb.WriteString("########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"# Global flags\n" +
|
||||||
|
"########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"# Suppress some warnings to make the build process look neater\n" +
|
||||||
|
"SUPPRESS_FLAGS += -Wno-unused-parameter " +
|
||||||
|
"-Wno-unused-variable -Wno-nonnull \\\n" +
|
||||||
|
"-Wno-unused-but-set-variable -Wno-unused-label " +
|
||||||
|
"-Wno-char-subscripts \\\n-Wno-unused-function " +
|
||||||
|
"-Wno-missing-field-initializers -Wno-uninitialized \\\n" +
|
||||||
|
"-Wno-array-bounds -Wno-maybe-uninitialized " +
|
||||||
|
"-Wno-pointer-sign -Wno-unused-value \\\n" +
|
||||||
|
"-Wno-unused-macros -Wno-parentheses " +
|
||||||
|
"-Wno-implicit-function-declaration \\\n" +
|
||||||
|
"-Wno-missing-braces -Wno-endif-labels " +
|
||||||
|
"-Wno-unused-but-set-variable \\\n" +
|
||||||
|
"-Wno-implicit-function-declaration -Wno-type-limits " +
|
||||||
|
"-Wno-sign-compare\n\n")
|
||||||
|
|
||||||
|
// Add SUPPRESS Flags
|
||||||
|
sb.WriteString("APP" + strings.ToUpper(programName) + "_" +
|
||||||
|
typeFile(filetype) + "FLAGS-y +=" + "$(SUPPRESS_FLAGS)\n\n" +
|
||||||
|
"# ADD the flags of your app HERE\n\n")
|
||||||
|
|
||||||
|
// Add additional lines
|
||||||
|
if len(makefileLines) > 0 {
|
||||||
|
b, _ := u.OpenTextFile(makefileLines)
|
||||||
|
for _, line := range strings.Split(string(b), "\n") {
|
||||||
|
if len(line) > 0 {
|
||||||
|
sb.WriteString("APP" + strings.ToUpper(programName) +
|
||||||
|
"_CFLAGS-y += " + line + "\n")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add source files
|
||||||
|
sb.WriteString("########################################" +
|
||||||
|
"########################################\n" +
|
||||||
|
"# " + programName + "sources\n" +
|
||||||
|
"########################################" +
|
||||||
|
"########################################\n")
|
||||||
|
|
||||||
|
for _, s := range sourceFiles {
|
||||||
|
sb.WriteString("APP" + strings.ToUpper(programName) +
|
||||||
|
"_SRCS-y += $(APP" + strings.ToUpper(programName) +
|
||||||
|
"_BASE)/" + s + "\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save the content to Makefile.uk
|
||||||
|
return u.WriteToFile(filename, []byte(sb.String()))
|
||||||
|
}
|
srcs/buildtool/microlibs_definitions.go (new file, +68)
@@ -0,0 +1,68 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
// Exported const representing micro-libs definitions.
|
||||||
|
const (
|
||||||
|
CARES = "c-ares"
|
||||||
|
COMPILERRT = "compiler-rt"
|
||||||
|
DEVFS = "devfs"
|
||||||
|
DPDK = "dpdk"
|
||||||
|
EIGEN = "eigen"
|
||||||
|
FP16 = "fp16"
|
||||||
|
FXDIV = "fxdiv"
|
||||||
|
INTELINTRINSICS = "intel-intrinsics"
|
||||||
|
LIBAXTLS = "libaxtls"
|
||||||
|
LIBBLOCK = "libblock"
|
||||||
|
LIBCXX = "libcxx"
|
||||||
|
LIBCXXABI = "libcxxabi"
|
||||||
|
LIBGO = "libgo"
|
||||||
|
LIBJVM = "libjvm"
|
||||||
|
LIBPROC = "libproc"
|
||||||
|
LIBRUBY = "libruby"
|
||||||
|
LIBRUST = "librust"
|
||||||
|
LIBUKSCHEDPREEMPT = "libukschedpreempt"
|
||||||
|
LIBUNWIND = "libunwind"
|
||||||
|
LIBUUID = "libuuid"
|
||||||
|
LIBUV = "libuv"
|
||||||
|
LIBV8 = "libv8"
|
||||||
|
LWIP = "lwip"
|
||||||
|
MICROPYTHON = "micropython"
|
||||||
|
MUSL = "musl"
|
||||||
|
NEWLIB = "newlib"
|
||||||
|
NOBLIM = "noblim"
|
||||||
|
NOLIBC = "nolibc"
|
||||||
|
OPENSSL = "openssl"
|
||||||
|
PFS9 = "9pfs"
|
||||||
|
POSIXLIBDL = "posix-libdl"
|
||||||
|
POSIXPROCESS = "posix-process"
|
||||||
|
POSIXUSER = "posix-user"
|
||||||
|
PTHREADEMBEDDED = "pthread-embedded"
|
||||||
|
PTHREADPOOL = "pthreadpool"
|
||||||
|
PYTHON = "python"
|
||||||
|
RAMFS = "ramfs"
|
||||||
|
SYSCALLSHIM = "syscallshim"
|
||||||
|
UKALLOC = "ukalloc"
|
||||||
|
UKALLOCBBUDDY = "ukallocbbuddy"
|
||||||
|
UKARGPARSE = "ukargparse"
|
||||||
|
UKBOOT = "ukboot"
|
||||||
|
UKBUS = "ukbus"
|
||||||
|
UKDEBUG = "ukdebug"
|
||||||
|
UKLOCK = "uklock"
|
||||||
|
UKMPI = "ukmpi"
|
||||||
|
UKNETDEV = "uknetdev"
|
||||||
|
UKPCI = "ukpci"
|
||||||
|
UKSCHED = "uksched"
|
||||||
|
UKSCHEDCOOP = "ukschedcoop"
|
||||||
|
UKSGLIST = "uksglist"
|
||||||
|
UKSYSINFO = "uksysinfo"
|
||||||
|
UKSWRAND = "ukswrand"
|
||||||
|
UKTIMECONV = "uktimeconv"
|
||||||
|
UKTIME = "uktime"
|
||||||
|
VFSCORE = "vfscore"
|
||||||
|
ZLIB = "zlib"
|
||||||
|
)
|
srcs/buildtool/microlibs_process.go (new file, +198)
@@ -0,0 +1,198 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"io/ioutil"
|
||||||
|
"strings"
|
||||||
|
"sync"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
exportFile = "exportsyms.uk"
|
||||||
|
prefixUrl = "http://xenbits.xen.org/gitweb/?p=unikraft/libs/"
|
||||||
|
suffixUrl = ";a=blob_plain;f=exportsyms.uk;hb=refs/heads/staging"
|
||||||
|
)
|
||||||
|
|
||||||
|
// -----------------------------Match micro-libs--------------------------------
|
||||||
|
|
||||||
|
// processSymbols adds symbols within the 'exportsyms.uk' file into a map.
|
||||||
|
//
|
||||||
|
func processSymbols(microLib, output string, mapSymbols map[string][]string) {
|
||||||
|
|
||||||
|
lines := strings.Split(output, "\n")
|
||||||
|
for _, line := range lines {
|
||||||
|
if len(line) > 0 && !strings.Contains(line, "#") &&
|
||||||
|
strings.Compare(line, "none") != 0 {
|
||||||
|
mapSymbols[line] = append(mapSymbols[line], microLib)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
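For illustration, here is a standalone sketch of the symbol-to-library map that `processSymbols` builds; the 'exportsyms.uk' contents below are made up for the example:

```go
package main

import (
	"fmt"
	"strings"
)

// Same filtering rules as processSymbols above: skip blank lines,
// comment lines containing '#', and the literal "none".
func processSymbols(microLib, output string, mapSymbols map[string][]string) {
	for _, line := range strings.Split(output, "\n") {
		if len(line) > 0 && !strings.Contains(line, "#") && line != "none" {
			mapSymbols[line] = append(mapSymbols[line], microLib)
		}
	}
}

func main() {
	symbols := make(map[string][]string)
	// Hypothetical exportsyms.uk contents for two micro-libs.
	processSymbols("lwip", "# exported symbols\nsocket\nbind\nrecv\n", symbols)
	processSymbols("newlib", "printf\nsocket\n", symbols)
	fmt.Println(symbols["socket"]) // [lwip newlib]
}
```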
|
||||||
|
|
||||||
|
// fetchSymbolsInternalLibs fetches all symbols within 'exportsyms.uk' files
|
||||||
|
// from Unikraft's internal libs and adds them into a map.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func fetchSymbolsInternalLibs(unikraftLibs string,
|
||||||
|
microLibs map[string][]string) error {
|
||||||
|
|
||||||
|
// Read files within the Unikraft directory
|
||||||
|
files, err := ioutil.ReadDir(unikraftLibs)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Read Unikraft internal libs symbols (exportsyms.uk)
|
||||||
|
for _, f := range files {
|
||||||
|
if f.IsDir() {
|
||||||
|
export := unikraftLibs + f.Name() + u.SEP + exportFile
|
||||||
|
if exists, _ := u.Exists(export); exists {
|
||||||
|
u.PrintInfo("Retrieving symbols of internal lib: " + f.Name())
|
||||||
|
b, _ := u.OpenTextFile(export)
|
||||||
|
processSymbols(f.Name(), string(b), microLibs)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// fetchSymbolsExternalLibs fetches all symbols within 'exportsyms.uk' files
|
||||||
|
// from Unikraft's external libs and adds them into a map.
|
||||||
|
//
|
||||||
|
// It returns a list of symbols and an error if any, otherwise it returns nil.
|
||||||
|
func fetchSymbolsExternalLibs(url string,
|
||||||
|
microLibs map[string][]string) (map[string]string, error) {
|
||||||
|
|
||||||
|
var externalLibs map[string]string
|
||||||
|
if body, err := u.DownloadFile(url); err != nil {
|
||||||
|
return nil, err
|
||||||
|
} else {
|
||||||
|
externalLibs = u.GitFindExternalLibs(*body)
|
||||||
|
|
||||||
|
var wg sync.WaitGroup
|
||||||
|
wg.Add(len(externalLibs))
|
||||||
|
// Iterate through all external libs to parse 'exportsyms.uk' file
|
||||||
|
for lib, git := range externalLibs {
|
||||||
|
// Use a goroutine for better efficiency
|
||||||
|
go func(lib, git string, microLibs map[string][]string) {
|
||||||
|
defer wg.Done()
|
||||||
|
u.PrintInfo("Retrieving symbols of external lib: " + lib)
|
||||||
|
if symbols, err := u.DownloadFile(prefixUrl + git + suffixUrl); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
processSymbols(lib, *symbols, microLibs)
|
||||||
|
}
|
||||||
|
}(lib, git, microLibs)
|
||||||
|
}
|
||||||
|
wg.Wait()
|
||||||
|
}
|
||||||
|
return externalLibs, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// matchSymbols performs the matching between Unikraft's micro-libs and
|
||||||
|
// libraries used by a given application based on the list of symbols that both
|
||||||
|
// contain.
|
||||||
|
//
|
||||||
|
// It returns a list of micro-libs that are required by the application
|
||||||
|
func matchSymbols(matchedLibs []string, data map[string]string,
|
||||||
|
microLibs map[string][]string) []string {
|
||||||
|
for key := range data {
|
||||||
|
if values, ok := microLibs[key]; ok {
|
||||||
|
for _, value := range values {
|
||||||
|
|
||||||
|
// todo remove
|
||||||
|
if strings.Compare(NOLIBC, value) == 0 {
|
||||||
|
value = NEWLIB
|
||||||
|
}
|
||||||
|
// remove above
|
||||||
|
|
||||||
|
if !u.Contains(matchedLibs, value) {
|
||||||
|
matchedLibs = append(matchedLibs, value)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return matchedLibs
|
||||||
|
}
|
||||||
|
|
||||||
|
// matchLibs performs the matching between Unikraft's micro-libs and
|
||||||
|
// libraries used by a given application
|
||||||
|
//
|
||||||
|
// It returns a list of micro-libs that are required by the application and an
|
||||||
|
// error if any, otherwise it returns nil.
|
||||||
|
func matchLibs(unikraftLibs string, data *u.Data) ([]string, map[string]string, error) {
|
||||||
|
|
||||||
|
mapSymbols := make(map[string][]string)
|
||||||
|
|
||||||
|
matchedLibs := make([]string, 0)
|
||||||
|
if err := fetchSymbolsInternalLibs(unikraftLibs, mapSymbols); err != nil {
|
||||||
|
return nil, nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get list of libs from xenbits
|
||||||
|
url := "http://xenbits.xen.org/gitweb/?pf=unikraft/libs"
|
||||||
|
externalLibs, err := fetchSymbolsExternalLibs(url, mapSymbols)
|
||||||
|
if err != nil {
|
||||||
|
return nil, nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Perform the matching symbols on static data
|
||||||
|
matchedLibs = matchSymbols(matchedLibs, data.StaticData.Symbols, mapSymbols)
|
||||||
|
|
||||||
|
// Perform the matching symbols on dynamic data
|
||||||
|
matchedLibs = matchSymbols(matchedLibs, data.DynamicData.Symbols, mapSymbols)
|
||||||
|
|
||||||
|
return matchedLibs, externalLibs, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// -----------------------------Clone micro-libs--------------------------------
|
||||||
|
|
||||||
|
// cloneGitRepo clones a specific git repository that hosts an external
|
||||||
|
// micro-libs on http://xenbits.xen.org/
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func cloneGitRepo(url, unikraftPathLibs string) error {
|
||||||
|
|
||||||
|
u.PrintInfo("Clone git repository " + url)
|
||||||
|
if _, _, err := u.GitCloneRepository(url, unikraftPathLibs, true); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
u.PrintOk("Git repository " + url + " has been cloned into " +
|
||||||
|
unikraftPathLibs)
|
||||||
|
|
||||||
|
u.PrintInfo("Git branch " + url)
|
||||||
|
if _, _, err := u.GitBranchStaging(unikraftPathLibs, true); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// cloneLibsFolders clones all the micro-libs that are needed by a
|
||||||
|
// given application
|
||||||
|
//
|
||||||
|
func cloneLibsFolders(unikraftPath string, matchedLibs []string,
|
||||||
|
externalLibs map[string]string) {
|
||||||
|
|
||||||
|
for _, lib := range matchedLibs {
|
||||||
|
if _, ok := externalLibs[lib]; ok {
|
||||||
|
exists, _ := u.Exists(unikraftPath + u.LIBSFOLDER + lib)
|
||||||
|
if !exists {
|
||||||
|
// If the micro-libs is not in the local host, clone it
|
||||||
|
if err := cloneGitRepo("git://xenbits.xen.org/unikraft/"+
|
||||||
|
"libs/"+lib+".git", unikraftPath+ u.LIBSFOLDER); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Library " + lib + " already exists in folder" +
|
||||||
|
unikraftPath + u.LIBSFOLDER)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
srcs/buildtool/run_buildtool.go (new file, +411)
@@ -0,0 +1,411 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"github.com/AlecAivazis/survey"
|
||||||
|
"io/ioutil"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// STATES
|
||||||
|
const (
|
||||||
|
compilerError = iota
|
||||||
|
linkingError
|
||||||
|
success
|
||||||
|
)
|
||||||
|
|
||||||
|
const pageSize = 10
|
||||||
|
// -----------------------------Generate Config---------------------------------
|
||||||
|
|
||||||
|
// generateConfigUk generates a 'Config.uk' file for the Unikraft build system.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func generateConfigUk(filename, programName string, matchedLibs []string) error {
|
||||||
|
|
||||||
|
var sb strings.Builder
|
||||||
|
sb.WriteString("### Invisible option for dependencies\n" +
|
||||||
|
"config APP" + programName + "_DEPENDENCIES\n" + "\tbool\n" +
|
||||||
|
"\tdefault y\n")
|
||||||
|
|
||||||
|
for _, lib := range matchedLibs {
|
||||||
|
sb.WriteString("\tselect " + lib + "\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save the content to Config.uk
|
||||||
|
return u.WriteToFile(filename, []byte(sb.String()))
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------Process make output-------------------------------
|
||||||
|
|
||||||
|
// checkMakeOutput checks if errors or warning are displayed during the
|
||||||
|
// execution of the 'make' command.
|
||||||
|
//
|
||||||
|
// It returns an integer that defines the result of 'make':
|
||||||
|
// <SUCCESS, LINKING_ERROR, COMPILER_ERROR>
|
||||||
|
func checkMakeOutput(appFolder string, stderr *string) int {
|
||||||
|
|
||||||
|
if stderr == nil {
|
||||||
|
return success
|
||||||
|
}
|
||||||
|
|
||||||
|
// Linking errors during make
|
||||||
|
if strings.Contains(*stderr, "undefined") {
|
||||||
|
|
||||||
|
str := parseMakeOutput(*stderr)
|
||||||
|
if len(str) > 0 {
|
||||||
|
if err := u.WriteToFile(appFolder+"stub.c", []byte(str)); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return linkingError
|
||||||
|
}
|
||||||
|
|
||||||
|
// Compiler errors during make
|
||||||
|
if strings.Contains(*stderr, "error:") {
|
||||||
|
|
||||||
|
return compilerError
|
||||||
|
}
|
||||||
|
|
||||||
|
return success
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseMakeOutput parses the output of the 'make' command.
|
||||||
|
//
|
||||||
|
// It returns a string that contains stubs of undefined function(s).
|
||||||
|
func parseMakeOutput(output string) string {
|
||||||
|
|
||||||
|
var sb strings.Builder
|
||||||
|
sb.WriteString("#include <stdio.h>\n")
|
||||||
|
|
||||||
|
undefinedSymbols := make(map[string]*string)
|
||||||
|
var re = regexp.MustCompile(`(?mi).*undefined reference to\s\x60(.*)'`)
|
||||||
|
for _, match := range re.FindAllStringSubmatch(output, -1) {
|
||||||
|
if _, ok := undefinedSymbols[match[1]]; !ok {
|
||||||
|
sb.WriteString("void ")
|
||||||
|
sb.WriteString(match[1])
|
||||||
|
sb.WriteString("(void){\n\tprintf(\"STUB\\n\");\n}\n\n")
|
||||||
|
undefinedSymbols[match[1]] = nil
|
||||||
|
u.PrintInfo("Add stub to function: " + match[1])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return sb.String()
|
||||||
|
}
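To make the stub-generation step concrete, here is a standalone sketch that runs the same regular expression over hypothetical linker errors and prints the stubs that would be written to `stub.c`:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Hypothetical linker output; the real input comes from 'make' stderr.
	output := "main.o: undefined reference to `gettid'\n" +
		"main.o: undefined reference to `sched_getcpu'"

	// Same pattern as parseMakeOutput above (\x60 is a backtick).
	re := regexp.MustCompile(`(?mi).*undefined reference to\s\x60(.*)'`)
	for _, match := range re.FindAllStringSubmatch(output, -1) {
		// Each undefined symbol becomes an empty stub that prints "STUB".
		fmt.Printf("void %s(void){\n\tprintf(\"STUB\\n\");\n}\n\n", match[1])
	}
}
```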
|
||||||
|
|
||||||
|
// -------------------------------------Run-------------------------------------
|
||||||
|
|
||||||
|
// RunBuildTool runs the automatic build tool to build a unikernel of a
|
||||||
|
// given application.
|
||||||
|
//
|
||||||
|
func RunBuildTool(homeDir string, data *u.Data) {
|
||||||
|
|
||||||
|
// Init and parse local arguments
|
||||||
|
args := new(u.Arguments)
|
||||||
|
p, err := args.InitArguments()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
if err := parseLocalArguments(p, args); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get program Name
|
||||||
|
programName := *args.StringArg[programArg]
|
||||||
|
|
||||||
|
// Take base path if absolute path is used
|
||||||
|
if filepath.IsAbs(programName) {
|
||||||
|
programName = filepath.Base(programName)
|
||||||
|
}
|
||||||
|
|
||||||
|
var unikraftPath string
|
||||||
|
if len(*args.StringArg[unikraftArg]) == 0 {
|
||||||
|
path, err := setUnikraftFolder(homeDir + u.SEP)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
unikraftPath = *path
|
||||||
|
} else {
|
||||||
|
unikraftPath = *args.StringArg[unikraftArg]
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if sources argument is set
|
||||||
|
if len(*args.StringArg[sourcesArg]) == 0 {
|
||||||
|
u.PrintErr("sources argument '-s' must be set")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if the unikraft folder contains the 3 required folders
|
||||||
|
if _, err := ioutil.ReadDir(unikraftPath); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
} else {
|
||||||
|
path, err := setUnikraftSubFolders(homeDir + u.SEP + u.UNIKRAFTFOLDER)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
unikraftPath = *path
|
||||||
|
}
|
||||||
|
|
||||||
|
// If data is not initialized, read output from dependency analysis tool
|
||||||
|
if data == nil {
|
||||||
|
u.PrintInfo("Initialize data")
|
||||||
|
outFolder := homeDir + u.SEP + programName + "_" + u.OUTFOLDER
|
||||||
|
if data, err = u.ReadDataJson(outFolder+programName, data); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create unikraft application path
|
||||||
|
appFolderPtr, err := createUnikraftApp(programName, unikraftPath)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
appFolder := *appFolderPtr
|
||||||
|
|
||||||
|
// Create the folder 'include' if it does not exist
|
||||||
|
includeFolder, err := createIncludeFolder(appFolder)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get sources files
|
||||||
|
sourcesPath := *args.StringArg[sourcesArg]
|
||||||
|
|
||||||
|
// Copy all .h into the include folder
|
||||||
|
sourceFiles, includesFiles := make([]string, 0), make([]string, 0)
|
||||||
|
|
||||||
|
// Move source files to Unikraft folder
|
||||||
|
if sourceFiles, err = processSourceFiles(sourcesPath, appFolder, *includeFolder,
|
||||||
|
sourceFiles, includesFiles); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Filter source files to limit build errors (e.g., remove test files,
|
||||||
|
// multiple main files, ...)
|
||||||
|
filterSourceFiles := filterSourcesFiles(sourceFiles)
|
||||||
|
|
||||||
|
// Prompt file selection
|
||||||
|
prompt := &survey.MultiSelect{
|
||||||
|
Message: "Select the sources of the program",
|
||||||
|
Options: sourceFiles,
|
||||||
|
Default: filterSourceFiles,
|
||||||
|
PageSize: pageSize,
|
||||||
|
}
|
||||||
|
|
||||||
|
var selectedFiles []string
|
||||||
|
if err := survey.AskOne(prompt, &selectedFiles); err != nil {
|
||||||
|
panic(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Match micro-libs
|
||||||
|
matchedLibs, externalLibs, err := matchLibs(unikraftPath+"unikraft"+u.SEP+
|
||||||
|
"lib"+u.SEP, data)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Clone the external git repositories
|
||||||
|
cloneLibsFolders(unikraftPath, matchedLibs, externalLibs)
|
||||||
|
|
||||||
|
// Match internal dependencies between micro-libs
|
||||||
|
if err := searchInternalDependencies(unikraftPath, &matchedLibs,
|
||||||
|
externalLibs); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, lib := range matchedLibs {
|
||||||
|
u.PrintOk("Match lib: " + lib)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Clone the external git repositories (if changed)
|
||||||
|
cloneLibsFolders(unikraftPath, matchedLibs, externalLibs)
|
||||||
|
|
||||||
|
// Generate Makefiles
|
||||||
|
if err := generateMake(programName, appFolder, unikraftPath, *args.StringArg[makefileArg],
|
||||||
|
matchedLibs, selectedFiles, externalLibs); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Delete Build folder
|
||||||
|
deleteBuildFolder(appFolder)
|
||||||
|
|
||||||
|
// Initialize config files
|
||||||
|
initConfig(appFolder, matchedLibs)
|
||||||
|
|
||||||
|
// Run make
|
||||||
|
runMake(programName, appFolder)
|
||||||
|
}
|
||||||
|
|
||||||
|
func searchInternalDependencies(unikraftPath string, matchedLibs *[]string,
|
||||||
|
externalLibs map[string]string) error {
|
||||||
|
|
||||||
|
for _, lib := range *matchedLibs {
|
||||||
|
// Consider only external libs
|
||||||
|
if _, ok := externalLibs[lib]; ok {
|
||||||
|
|
||||||
|
// Get and read Config.UK from external lib
|
||||||
|
configUk := unikraftPath + u.LIBSFOLDER + lib + u.SEP + "Config.uk"
|
||||||
|
lines, err := u.ReadLinesFile(configUk)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Process Config.UK file
|
||||||
|
mapConfig := make(map[string][]string)
|
||||||
|
u.ProcessConfigUK(lines, false, mapConfig, nil)
|
||||||
|
|
||||||
|
for config := range mapConfig {
|
||||||
|
|
||||||
|
// Remove LIB prefix
|
||||||
|
if strings.Contains(config, "LIB") {
|
||||||
|
config = strings.TrimPrefix(config, "LIB")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Replace underscore by dash
|
||||||
|
if strings.Contains(config, "_") {
|
||||||
|
config = strings.ReplaceAll(config, "_", "-")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if matchedLibs already contains the lib
|
||||||
|
config = strings.ToLower(config)
|
||||||
|
if !u.Contains(*matchedLibs, config) {
|
||||||
|
*matchedLibs = append(*matchedLibs, config)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func generateMake(programName, appFolder, unikraftPath, makefile string,
|
||||||
|
matchedLibs, sourceFiles []string, externalLibs map[string]string) error {
|
||||||
|
// Generate Makefile
|
||||||
|
if err := generateMakefile(appFolder+"Makefile", unikraftPath,
|
||||||
|
appFolder, matchedLibs, externalLibs); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate Config.uk
|
||||||
|
if err := generateConfigUk(appFolder+"Config.uk",
|
||||||
|
strings.ToUpper(programName), matchedLibs); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the file type for Unikraft flag
|
||||||
|
fileType := languageUsed()
|
||||||
|
|
||||||
|
// Generate Makefile.uk
|
||||||
|
if err := generateMakefileUK(appFolder+"Makefile.uk", programName,
|
||||||
|
fileType, makefile, sourceFiles); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func deleteBuildFolder(appFolder string) {
|
||||||
|
// Delete build folder if already exists
|
||||||
|
if file, err := u.OSReadDir(appFolder); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
for _, f := range file {
|
||||||
|
if f.IsDir() && f.Name() == "build" {
|
||||||
|
u.PrintWarning("build folder already exists. Delete it.")
|
||||||
|
if err := os.RemoveAll(appFolder + "build"); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func initConfig(appFolder string, matchedLibs []string) {
|
||||||
|
|
||||||
|
// Run 'make allnoconfig' to generate a default .config file
|
||||||
|
if strOut, strErr, err := u.ExecuteWaitCommand(appFolder, "make",
|
||||||
|
"allnoconfig"); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
} else if len(*strErr) > 0 {
|
||||||
|
u.PrintErr("error during generating .config: " + *strErr)
|
||||||
|
} else if len(*strOut) > 0 && !strings.Contains(*strOut,
|
||||||
|
"configuration written") {
|
||||||
|
u.PrintWarning("Default .config cannot be generated")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse .config
|
||||||
|
kConfigMap := make(map[string]*KConfig)
|
||||||
|
items := make([]*KConfig, 0)
|
||||||
|
items, err := parseConfig(appFolder+".config", kConfigMap, items,
|
||||||
|
matchedLibs)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update .config
|
||||||
|
items = updateConfig(kConfigMap, items)
|
||||||
|
|
||||||
|
// Write .config
|
||||||
|
if err := writeConfig(appFolder+".config", items); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func runMake(programName, appFolder string) {
|
||||||
|
// Run make
|
||||||
|
stdout, stderr, _ := u.ExecuteRunCmd("make", appFolder, true)
|
||||||
|
|
||||||
|
// Check the state of the make command
|
||||||
|
state := checkMakeOutput(appFolder, stderr)
|
||||||
|
if state == linkingError {
|
||||||
|
|
||||||
|
// Add new stub.c in Makefile.uk
|
||||||
|
d := "APP" + strings.ToUpper(programName) +
|
||||||
|
"_SRCS-y += $(APP" + strings.ToUpper(programName) +
|
||||||
|
"_BASE)/stub.c"
|
||||||
|
if err := u.UpdateFile(appFolder+"Makefile.uk", []byte(d)); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run make a second time
|
||||||
|
stdout, stderr, _ = u.ExecuteRunCmd("make", appFolder, true)
|
||||||
|
|
||||||
|
// Check the state of the make command
|
||||||
|
checkMakeOutput(appFolder, stderr)
|
||||||
|
}
|
||||||
|
|
||||||
|
out := appFolder + programName
|
||||||
|
|
||||||
|
// Save the make output into a warnings file if any warnings were emitted
|
||||||
|
if stderr != nil && strings.Contains(*stderr, "warning:") {
|
||||||
|
if err := u.WriteToFile(out+"_warnings.txt", []byte(*stderr)); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Warnings are written in file: " + out + "_warnings.txt")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save make output into output.txt
|
||||||
|
if stdout != nil {
|
||||||
|
if err := u.WriteToFile(out+"_output.txt", []byte(*stdout)); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Output is written in file: " + out + "_output.txt")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if state == compilerError {
|
||||||
|
u.PrintErr("Fix compilation errors")
|
||||||
|
} else if state == success {
|
||||||
|
u.PrintOk("Unikernel created in Folder: 'build/'")
|
||||||
|
}
|
||||||
|
}
|
240
srcs/buildtool/unikraft_files_process.go
Normal file
|
@ -0,0 +1,240 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package buildtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ---------------------------Create Include Folder-----------------------------
|
||||||
|
|
||||||
|
func createIncludeFolder(appFolder string) (*string, error) {
|
||||||
|
|
||||||
|
includeFolder := appFolder + u.INCLUDEFOLDER
|
||||||
|
if _, err := u.CreateFolder(includeFolder); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return &includeFolder, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ----------------------------Set UNIKRAFT Folders-----------------------------
|
||||||
|
func setUnikraftFolder(homeDir string) (*string, error) {
|
||||||
|
|
||||||
|
unikraftFolder := homeDir + u.UNIKRAFTFOLDER
|
||||||
|
|
||||||
|
created, err := u.CreateFolder(unikraftFolder)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if created {
|
||||||
|
if _, err := setUnikraftSubFolders(unikraftFolder); err != nil {
	return nil, err
}
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Unikraft folder already exists")
|
||||||
|
return &unikraftFolder, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return &unikraftFolder, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func setUnikraftSubFolders(unikraftFolder string) (*string, error) {
|
||||||
|
|
||||||
|
u.PrintInfo("Create Unikraft folder with apps and libs subfolders")
|
||||||
|
|
||||||
|
// Create 'apps' and 'libs' subfolders
|
||||||
|
if _, err := u.CreateFolder(unikraftFolder + u.APPSFOLDER); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if _, err := u.CreateFolder(unikraftFolder + u.LIBSFOLDER); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Download git repo of unikraft
|
||||||
|
if _, _, err := u.GitCloneRepository("git://xenbits.xen.org/unikraft/unikraft.git",
|
||||||
|
unikraftFolder, true); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use staging branch
|
||||||
|
if _, _, err := u.GitBranchStaging(unikraftFolder+"unikraft", true); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return &unikraftFolder, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------Check UNIKRAFT Folder-----------------------------
|
||||||
|
|
||||||
|
func containsUnikraftFolders(files []os.FileInfo) bool {
|
||||||
|
|
||||||
|
if len(files) == 0 {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
m := make(map[string]bool)
|
||||||
|
m[u.APPSFOLDER], m[u.LIBSFOLDER], m[u.UNIKRAFTFOLDER] = false, false, false
|
||||||
|
|
||||||
|
var folderName string
|
||||||
|
for _, f := range files {
|
||||||
|
folderName = f.Name() + u.SEP
|
||||||
|
if _, ok := m[folderName]; ok {
|
||||||
|
m[folderName] = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return m[u.APPSFOLDER] && m[u.LIBSFOLDER] && m[u.UNIKRAFTFOLDER]
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------UNIKRAFT APP FOLDER-------------------------------
|
||||||
|
|
||||||
|
func createUnikraftApp(programName, unikraftPath string) (*string, error) {
|
||||||
|
|
||||||
|
var appFolder string
|
||||||
|
if unikraftPath[len(unikraftPath)-1] != os.PathSeparator {
|
||||||
|
appFolder = unikraftPath + u.SEP + u.APPSFOLDER + programName + u.SEP
|
||||||
|
} else {
|
||||||
|
appFolder = unikraftPath + u.APPSFOLDER + programName + u.SEP
|
||||||
|
}
|
||||||
|
|
||||||
|
created, err := u.CreateFolder(appFolder)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if !created {
|
||||||
|
u.PrintWarning(appFolder + " already exists.")
|
||||||
|
handleCreationApp(&appFolder)
|
||||||
|
}
|
||||||
|
|
||||||
|
return &appFolder, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// -----------------------------Create App folder-------------------------------
|
||||||
|
|
||||||
|
func handleCreationApp(appFolder *string) {
|
||||||
|
fmt.Println("Make your choice:\n1: Copy and overwrite files\n2: " +
|
||||||
|
"Enter manually the name of the folder\n3: exit program")
|
||||||
|
var input int
|
||||||
|
for {
|
||||||
|
fmt.Print("Please enter your choice (0 to exit): ")
|
||||||
|
if _, err := fmt.Scanf("%d", &input); err != nil {
|
||||||
|
u.PrintWarning("Choice must be numeric! Try again")
|
||||||
|
} else {
|
||||||
|
switch input {
|
||||||
|
case 1:
|
||||||
|
return
|
||||||
|
case 2:
|
||||||
|
fmt.Print("Enter text: ")
|
||||||
|
reader := bufio.NewReader(os.Stdin)
|
||||||
|
text, _ := reader.ReadString('\n')
|
||||||
|
appFolder = &text
|
||||||
|
return
|
||||||
|
case 3:
|
||||||
|
os.Exit(1)
|
||||||
|
default:
|
||||||
|
u.PrintWarning("Invalid input! Try again")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// -------------------------MOVE FILES TO APP FOLDER----------------------------
|
||||||
|
|
||||||
|
var srcLanguages = map[string]int{
|
||||||
|
".c": 0,
|
||||||
|
".cpp": 0,
|
||||||
|
".cc": 0,
|
||||||
|
".S": 0,
|
||||||
|
".s": 0,
|
||||||
|
".asm": 0,
|
||||||
|
".py": 0,
|
||||||
|
".go": 0,
|
||||||
|
}
|
||||||
|
|
||||||
|
func filterSourcesFiles(sourceFiles []string) []string {
|
||||||
|
filterSrcFiles := make([]string, 0)
|
||||||
|
for _, file := range sourceFiles {
|
||||||
|
if !strings.Contains(file, "copy") &&
|
||||||
|
!strings.Contains(file, "test") &&
|
||||||
|
!strings.Contains(file, "unit") {
|
||||||
|
filterSrcFiles = append(filterSrcFiles, file)
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
||||||
|
return filterSrcFiles
|
||||||
|
}
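A self-contained sketch of the same filtering rule, with invented file names, to make the effect concrete:

package main

import (
	"fmt"
	"strings"
)

func main() {
	// Same rule as filterSourcesFiles: drop files whose names contain
	// "copy", "test" or "unit".
	sourceFiles := []string{"main.c", "parser.c", "parser_test.c", "unit_helpers.c"}
	kept := make([]string, 0)
	for _, file := range sourceFiles {
		if !strings.Contains(file, "copy") &&
			!strings.Contains(file, "test") &&
			!strings.Contains(file, "unit") {
			kept = append(kept, file)
		}
	}
	fmt.Println(kept) // [main.c parser.c]
}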
|
||||||
|
|
||||||
|
func processSourceFiles(sourcesPath, appFolder, includeFolder string,
|
||||||
|
sourceFiles, includesFiles []string) ([]string, error) {
|
||||||
|
|
||||||
|
err := filepath.Walk(sourcesPath, func(path string, info os.FileInfo,
|
||||||
|
err error) error {
|
||||||
|
|
||||||
|
if !info.IsDir() {
|
||||||
|
|
||||||
|
extension := filepath.Ext(info.Name())
|
||||||
|
if _, ok := srcLanguages[extension]; ok {
|
||||||
|
// Add source files to sourceFiles list
|
||||||
|
sourceFiles = append(sourceFiles, info.Name())
|
||||||
|
|
||||||
|
// Count the number of files per extension
|
||||||
|
srcLanguages[extension] += 1
|
||||||
|
|
||||||
|
// Copy source files to the appFolder
|
||||||
|
if err = u.CopyFileContents(path, appFolder+info.Name()); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
} else if extension == ".h" {
|
||||||
|
// Add header files to includesFiles list
|
||||||
|
includesFiles = append(includesFiles, info.Name())
|
||||||
|
|
||||||
|
// Copy header files to the INCLUDEFOLDER
|
||||||
|
if err = u.CopyFileContents(path, includeFolder+info.Name()); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
u.PrintWarning("Unsupported extension for file: " + info.Name())
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// If no source file, exit the program
|
||||||
|
if len(sourceFiles) == 0 {
|
||||||
|
return nil, errors.New("unable to find source files")
|
||||||
|
}
|
||||||
|
|
||||||
|
return sourceFiles, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func languageUsed() string {
|
||||||
|
|
||||||
|
max := -1
|
||||||
|
var mostUsedFiles string
|
||||||
|
for key, value := range srcLanguages {
|
||||||
|
if max < value {
|
||||||
|
max = value
|
||||||
|
mostUsedFiles = key
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return mostUsedFiles
|
||||||
|
}
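The selection of the dominant source language can be exercised on its own; the counts below are invented for illustration:

package main

import "fmt"

func main() {
	// Invented counts, as if three C files and one assembly file were found
	// while walking the application sources.
	srcLanguages := map[string]int{".c": 3, ".S": 1, ".go": 0}

	// Same selection logic as languageUsed: keep the most frequent extension.
	max, mostUsed := -1, ""
	for ext, count := range srcLanguages {
		if max < count {
			max = count
			mostUsed = ext
		}
	}
	fmt.Println("dominant extension:", mostUsed) // .c
}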
|
123
srcs/common/arguments.go
Normal file
|
@ -0,0 +1,123 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"github.com/akamensky/argparse"
|
||||||
|
"os"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Exported constants to determine arguments type.
|
||||||
|
const (
|
||||||
|
INT = iota
|
||||||
|
BOOL
|
||||||
|
STRING
|
||||||
|
)
|
||||||
|
|
||||||
|
// Exported constants to determine which tool is used.
|
||||||
|
const (
|
||||||
|
CRAWLER = "crawler"
|
||||||
|
DEP = "dep"
|
||||||
|
BUILD = "build"
|
||||||
|
VERIF = "verif"
|
||||||
|
PERF = "perf"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
unknownArgs = "unknown arguments"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Exported struct that stores the different types of arguments.
|
||||||
|
type Arguments struct {
|
||||||
|
IntArg map[string]*int
|
||||||
|
BoolArg map[string]*bool
|
||||||
|
StringArg map[string]*string
|
||||||
|
}
|
||||||
|
|
||||||
|
// InitArguments initializes the parser used to parse the given
|
||||||
|
// arguments.
|
||||||
|
//
|
||||||
|
// It returns a parser as well as an error if any, otherwise it returns nil.
|
||||||
|
func (args *Arguments) InitArguments() (*argparse.Parser, error) {
|
||||||
|
|
||||||
|
args.IntArg = make(map[string]*int)
|
||||||
|
args.BoolArg = make(map[string]*bool)
|
||||||
|
args.StringArg = make(map[string]*string)
|
||||||
|
|
||||||
|
p := argparse.NewParser("UNICORE toolchain",
|
||||||
|
"The UNICORE toolchain allows to build unikernels")
|
||||||
|
|
||||||
|
return p, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ParserWrapper parses arguments of the application and skips unknownArgs
|
||||||
|
// errors so that several levels of argument parsing can be chained.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func ParserWrapper(p *argparse.Parser, args []string) error {
|
||||||
|
|
||||||
|
err := p.Parse(args)
|
||||||
|
if err != nil && strings.Contains(err.Error(), unknownArgs) {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return err
|
||||||
|
}
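A sketch of the intended two-level parsing, assuming the module path tools/srcs/common used elsewhere in this commit; the command line shown is invented:

package main

import (
	"fmt"
	"os"

	"github.com/akamensky/argparse"
	u "tools/srcs/common"
)

func main() {
	p := argparse.NewParser("example", "two-level parsing sketch")
	build := p.Flag("", "build", &argparse.Options{Help: "run the build tool"})

	// With e.g. os.Args = [example --build --program hello], "--program"
	// belongs to a sub-tool, so this first pass must not fail on it.
	if err := u.ParserWrapper(p, os.Args); err != nil {
		u.PrintErr(err)
	}
	fmt.Println("build selected:", *build)
}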
|
||||||
|
|
||||||
|
// ParseMainArguments parses the main arguments of the application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func (*Arguments) ParseMainArguments(p *argparse.Parser, args *Arguments) error {
|
||||||
|
|
||||||
|
if args == nil {
|
||||||
|
return errors.New("args structure should be initialized")
|
||||||
|
}
|
||||||
|
|
||||||
|
args.InitArgParse(p, args, BOOL, "", CRAWLER,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Execute the crawler unikraft tool"})
|
||||||
|
args.InitArgParse(p, args, BOOL, "", DEP,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Execute only the dependency analysis tool"})
|
||||||
|
args.InitArgParse(p, args, BOOL, "", BUILD,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Execute only the automatic build tool"})
|
||||||
|
args.InitArgParse(p, args, BOOL, "", VERIF,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Execute only the verification tool"})
|
||||||
|
args.InitArgParse(p, args, BOOL, "", PERF,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Execute only the performance tool"})
|
||||||
|
|
||||||
|
// Parse only the two first arguments <program name, [tools]>
|
||||||
|
if len(os.Args) > 2 {
|
||||||
|
return ParserWrapper(p, os.Args[:2])
|
||||||
|
} else {
|
||||||
|
return p.Parse(os.Args)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// InitArgParse initializes the Arguments structure depending the type of
|
||||||
|
// the variable.
|
||||||
|
func (*Arguments) InitArgParse(p *argparse.Parser, args *Arguments, typeVar int,
|
||||||
|
short, long string, options *argparse.Options) {
|
||||||
|
switch typeVar {
|
||||||
|
case INT:
|
||||||
|
args.IntArg[long] = new(int)
|
||||||
|
args.IntArg[long] = p.Int(short, long, options)
|
||||||
|
case BOOL:
|
||||||
|
args.BoolArg[long] = new(bool)
|
||||||
|
args.BoolArg[long] = p.Flag(short, long, options)
|
||||||
|
case STRING:
|
||||||
|
args.StringArg[long] = new(string)
|
||||||
|
args.StringArg[long] = p.String(short, long, options)
|
||||||
|
}
|
||||||
|
}
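A minimal usage sketch of the Arguments helpers, again assuming the tools/srcs/common module path; the flag name and value are invented:

package main

import (
	"fmt"

	"github.com/akamensky/argparse"
	u "tools/srcs/common"
)

func main() {
	args := new(u.Arguments)
	p, err := args.InitArguments()
	if err != nil {
		u.PrintErr(err)
	}

	// Register a string flag and parse an invented command line.
	args.InitArgParse(p, args, u.STRING, "o", "output",
		&argparse.Options{Required: false, Help: "output folder"})
	if err := u.ParserWrapper(p, []string{"example", "--output", "/tmp/out"}); err != nil {
		u.PrintErr(err)
	}
	fmt.Println("output:", *args.StringArg["output"])
}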
|
28
srcs/common/data.go
Normal file
|
@ -0,0 +1,28 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
// Exported struct that represents static and dynamic data.
|
||||||
|
type Data struct {
|
||||||
|
StaticData StaticData `json:"static_data"`
|
||||||
|
DynamicData DynamicData `json:"dynamic_data"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Exported struct that represents data for static dependency analysis.
|
||||||
|
type StaticData struct {
|
||||||
|
Dependencies map[string][]string `json:"dependencies"`
|
||||||
|
SharedLibs map[string][]string `json:"shared_libs"`
|
||||||
|
SystemCalls map[string]string `json:"system_calls"`
|
||||||
|
Symbols map[string]string `json:"symbols"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// Exported struct that represents data for dynamic dependency analysis.
|
||||||
|
type DynamicData struct {
|
||||||
|
SharedLibs map[string][]string `json:"shared_libs"`
|
||||||
|
SystemCalls map[string]string `json:"system_calls"`
|
||||||
|
Symbols map[string]string `json:"symbols"`
|
||||||
|
}
|
177
srcs/common/file_process.go
Normal file
|
@ -0,0 +1,177 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"io"
|
||||||
|
"io/ioutil"
|
||||||
|
"os"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Exported constants for folder management
|
||||||
|
const (
|
||||||
|
SEP = string(os.PathSeparator)
|
||||||
|
OUTFOLDER = "output" + SEP
|
||||||
|
PERM = 0755
|
||||||
|
)
|
||||||
|
|
||||||
|
// OpenTextFile opens a file named by filename.
|
||||||
|
//
|
||||||
|
// It returns a slice of bytes which represents its content and an error if
|
||||||
|
// any, otherwise it returns nil.
|
||||||
|
func OpenTextFile(filename string) ([]byte, error) {
|
||||||
|
file, err := os.Open(filename)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer file.Close()
|
||||||
|
|
||||||
|
return ioutil.ReadAll(file)
|
||||||
|
}
|
||||||
|
|
||||||
|
// UpdateFile updates a file named by filename by adding new bytes at the end.
|
||||||
|
//
|
||||||
|
// It returns an error if
|
||||||
|
// any, otherwise it returns nil.
|
||||||
|
func UpdateFile(filename string, dataByte []byte) error {
|
||||||
|
input, err := ioutil.ReadFile(filename)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
result := append(input, dataByte...)
|
||||||
|
|
||||||
|
return WriteToFile(filename, result)
|
||||||
|
}
|
||||||
|
|
||||||
|
// WriteToFile creates and writes bytes content to a new file named by filename.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func WriteToFile(filename string, dataByte []byte) error {
|
||||||
|
err := ioutil.WriteFile(filename, dataByte, PERM)
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// OSReadDir reads the content of a folder named by root.
|
||||||
|
//
|
||||||
|
// It returns a slice of FileInfo values and an error if any, otherwise it
|
||||||
|
// returns nil.
|
||||||
|
func OSReadDir(root string) ([]os.FileInfo, error) {
|
||||||
|
var files []os.FileInfo
|
||||||
|
f, err := os.Open(root)
|
||||||
|
if err != nil {
|
||||||
|
return files, err
|
||||||
|
}
|
||||||
|
fileInfo, err := f.Readdir(-1)
|
||||||
|
f.Close()
|
||||||
|
if err != nil {
|
||||||
|
return files, err
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, file := range fileInfo {
|
||||||
|
files = append(files, file)
|
||||||
|
}
|
||||||
|
return files, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Exists checks if a given file exists.
|
||||||
|
//
|
||||||
|
// It returns true if the file exists and an error if any, otherwise it
|
||||||
|
// returns nil.
|
||||||
|
func Exists(path string) (bool, error) {
|
||||||
|
_, err := os.Stat(path)
|
||||||
|
if err == nil {
|
||||||
|
return true, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if os.IsNotExist(err) {
|
||||||
|
return false, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
return true, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// CreateFolder creates a folder if it does not exist.
|
||||||
|
//
|
||||||
|
// It returns true if the folder is created and an error if any, otherwise it
|
||||||
|
// returns nil.
|
||||||
|
func CreateFolder(path string) (bool, error) {
|
||||||
|
if _, err := os.Stat(path); os.IsNotExist(err) {
|
||||||
|
if err = os.Mkdir(path, PERM); err != nil {
|
||||||
|
return false, err
|
||||||
|
}
|
||||||
|
return true, nil
|
||||||
|
}
|
||||||
|
return false, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReadLinesFile reads a file line by line and saves its content into a slice.
|
||||||
|
//
|
||||||
|
// It returns a slice of string which represents each line of a file and an
|
||||||
|
// error if any, otherwise it returns nil.
|
||||||
|
func ReadLinesFile(path string) ([]string, error) {
|
||||||
|
|
||||||
|
f, err := os.Open(path)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer f.Close()
|
||||||
|
|
||||||
|
rd := bufio.NewReader(f)
|
||||||
|
|
||||||
|
var lines []string
|
||||||
|
for {
|
||||||
|
line, err := rd.ReadString('\n')
|
||||||
|
|
||||||
|
if len(line) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
// End of file, break the reading
|
||||||
|
if err == io.EOF {
|
||||||
|
lines = append(lines, line)
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
lines = append(lines, line)
|
||||||
|
}
|
||||||
|
|
||||||
|
return lines, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// CopyFileContents copies the contents of the file named src to the file named
|
||||||
|
// by dst. The file will be created if it does not already exist. If the
|
||||||
|
// destination file exists, all its contents will be replaced by the contents
|
||||||
|
// of the source file.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func CopyFileContents(src, dst string) (err error) {
|
||||||
|
in, err := os.Open(src)
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer in.Close()
|
||||||
|
out, err := os.Create(dst)
|
||||||
|
if err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer func() {
|
||||||
|
cErr := out.Close()
|
||||||
|
if err == nil {
|
||||||
|
err = cErr
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
if _, err = io.Copy(out, in); err != nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
err = out.Sync()
|
||||||
|
return
|
||||||
|
}
|
67
srcs/common/git.go
Normal file
|
@ -0,0 +1,67 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
const branch = "staging"
|
||||||
|
|
||||||
|
// GitCloneRepository clones a git repository at the given url.
|
||||||
|
//
|
||||||
|
// It returns two pointers of string which are respectively stdout and stderr
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func GitCloneRepository(url, dir string, v bool) (*string, *string, error) {
|
||||||
|
return ExecuteRunCmd("git", dir, v, "clone", url)
|
||||||
|
}
|
||||||
|
|
||||||
|
// GitBranchStaging updates the current branch of a git repository to the
|
||||||
|
// 'staging' branch.
|
||||||
|
//
|
||||||
|
// It returns two pointers of string which are respectively stdout and stderr
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func GitBranchStaging(dir string, v bool) (*string, *string, error) {
|
||||||
|
strOut, strErr, err := ExecuteRunCmd("git", dir, v, "branch", "-r")
|
||||||
|
if err != nil {
|
||||||
|
return strOut, strErr, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if strings.Contains(*strOut, branch) || strings.Contains(*strErr, branch) {
|
||||||
|
PrintInfo("Checkout to " + branch)
|
||||||
|
return ExecuteRunCmd("git", dir, v, "checkout", branch)
|
||||||
|
}
|
||||||
|
|
||||||
|
return strOut, strErr, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// GitPull pulls the current git repository.
|
||||||
|
//
|
||||||
|
// It returns two pointers of string which are respectively stdout and stderr
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func GitPull(dir string, v bool) (*string, *string, error) {
|
||||||
|
return ExecuteRunCmd("git", dir, v, "pull")
|
||||||
|
}
|
||||||
|
|
||||||
|
// GitFindExternalLibs finds all the external libraries of Unikraft which are
|
||||||
|
// hosted on Xenbits.
|
||||||
|
//
|
||||||
|
// It returns a map of all the external libs of Unikraft.
|
||||||
|
func GitFindExternalLibs(output string) map[string]string {
|
||||||
|
var re = regexp.MustCompile(
|
||||||
|
`(?m)<a class="list"\s+href="(.*);a=summary">.*</a>`)
|
||||||
|
|
||||||
|
matches := re.FindAllStringSubmatch(output, -1)
|
||||||
|
externalLibs := make(map[string]string, len(matches))
|
||||||
|
for _, match := range matches {
|
||||||
|
git := strings.Split(match[1], "/")
|
||||||
|
lib := strings.Split(git[len(git)-1], ".git")
|
||||||
|
externalLibs[lib[0]] = git[len(git)-1]
|
||||||
|
}
|
||||||
|
return externalLibs
|
||||||
|
}
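To illustrate the listing format the regular expression above expects, here is a sketch on a hand-written HTML fragment (not a real Xenbits page), assuming the tools/srcs/common module path:

package main

import (
	"fmt"

	u "tools/srcs/common"
)

func main() {
	// Hand-written fragment in the <a class="list" href="...;a=summary"> form
	// expected by GitFindExternalLibs; one anchor per line.
	html := `<a class="list" href="/gitweb/?p=unikraft/libs/lwip.git;a=summary">lwip</a>` + "\n" +
		`<a class="list" href="/gitweb/?p=unikraft/libs/newlib.git;a=summary">newlib</a>`

	for lib, repo := range u.GitFindExternalLibs(html) {
		fmt.Println(lib, "->", repo)
	}
	// lwip -> lwip.git and newlib -> newlib.git (in any order)
}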
|
138
srcs/common/graph_process.go
Normal file
|
@ -0,0 +1,138 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"math/rand"
|
||||||
|
"os"
|
||||||
|
|
||||||
|
"github.com/awalterschulze/gographviz"
|
||||||
|
)
|
||||||
|
|
||||||
|
const letters = "0123456789ABCDEF"
|
||||||
|
|
||||||
|
// RandStringBytes generates random string of size n.
|
||||||
|
//
|
||||||
|
// It returns a random string of a particular length.
|
||||||
|
func RandStringBytes(n int) string {
|
||||||
|
|
||||||
|
b := make([]byte, n)
|
||||||
|
for i := range b {
|
||||||
|
b[i] = letters[rand.Intn(len(letters))]
|
||||||
|
}
|
||||||
|
return string(b)
|
||||||
|
}
|
||||||
|
|
||||||
|
// ColorGenerator generates a color in RGB format.
|
||||||
|
//
|
||||||
|
// It returns a string which represents a random string formatted as RGB color.
|
||||||
|
func ColorGenerator() string {
|
||||||
|
return "#" + RandStringBytes(6)
|
||||||
|
}
|
||||||
|
|
||||||
|
// CreateGraphLabel creates a graph from a map.
|
||||||
|
//
|
||||||
|
// It returns a graph which represents all the direct and no-direct dependencies
|
||||||
|
// of a given application and an error if any, otherwise it returns nil.
|
||||||
|
func CreateGraphLabel(name string, data map[string][]string,
|
||||||
|
mapLabel map[string]string) (*gographviz.Escape, error) {
|
||||||
|
|
||||||
|
graph := gographviz.NewEscape()
|
||||||
|
|
||||||
|
if err := graph.SetName(name); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Directed graph
|
||||||
|
if err := graph.SetDir(true); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create graph from map
|
||||||
|
for key, values := range data {
|
||||||
|
|
||||||
|
colorsMap := map[string]string{}
|
||||||
|
|
||||||
|
// Generate a random color
|
||||||
|
if _, in := colorsMap[key]; !in {
|
||||||
|
colorsMap[key] = ColorGenerator()
|
||||||
|
}
|
||||||
|
|
||||||
|
attributes := map[string]string{"color": colorsMap[key]}
|
||||||
|
|
||||||
|
// Create nodes
|
||||||
|
if err := graph.AddNode("\""+key+"\"", "\""+key+"\"",
|
||||||
|
attributes); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if values != nil {
|
||||||
|
|
||||||
|
// Add edges
|
||||||
|
for _, v := range values {
|
||||||
|
|
||||||
|
if label, ok := mapLabel[v]; ok {
|
||||||
|
attributes["label"] = label
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := graph.AddEdge("\""+key+"\"", "\""+v+"\"", true,
|
||||||
|
attributes); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Delete label attributes if necessary
|
||||||
|
if _, ok := mapLabel[v]; ok {
|
||||||
|
delete(attributes, "label")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return graph, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// SaveGraphToFile saves a given graph to a file.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func SaveGraphToFile(filename string, graph *gographviz.Escape) error {
|
||||||
|
file, err := os.Create(filename)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
defer file.Close()
|
||||||
|
|
||||||
|
_, err = file.WriteString(graph.String())
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// GenerateGraph generates a given graph as '.dot' and '.png' files.
|
||||||
|
//
|
||||||
|
// It prints a warning if any step fails.
|
||||||
|
func GenerateGraph(programName, fullPathName string, data map[string][]string,
|
||||||
|
mapLabel map[string]string) {
|
||||||
|
// Create graph
|
||||||
|
graph, err := CreateGraphLabel(programName, data, mapLabel)
if err != nil {
	PrintWarning(err)
	return
}
|
||||||
|
|
||||||
|
// Save graph as '.dot' file
|
||||||
|
if err = SaveGraphToFile(fullPathName+".dot", graph); err != nil {
|
||||||
|
PrintWarning(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save graph as '.png' file
|
||||||
|
if _, err := ExecuteCommand("dot", []string{"-Tpng",
|
||||||
|
fullPathName + ".dot", "-o", fullPathName + ".png"}); err != nil {
|
||||||
|
PrintWarning(err)
|
||||||
|
PrintWarning("Open the following website to display the graph:" +
|
||||||
|
" https://dreampuf.github.io/GraphvizOnline/")
|
||||||
|
} else {
|
||||||
|
PrintOk("Graph saved into " + fullPathName + ".png")
|
||||||
|
}
|
||||||
|
}
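A usage sketch with an invented dependency map; the PNG step only succeeds when the Graphviz 'dot' binary is installed, otherwise only the '.dot' file is produced:

package main

import u "tools/srcs/common"

func main() {
	// Invented dependency map: "app" depends on two libraries, one of them
	// through a conditional 'select' rendered as an edge label.
	data := map[string][]string{
		"app":        {"libukalloc", "libuktime"},
		"libukalloc": nil,
		"libuktime":  nil,
	}
	labels := map[string]string{"libuktime": "if !HAVE_LIBC"}

	// Writes demo.dot and, when 'dot' is available, demo.png.
	u.GenerateGraph("demo", "demo", data, labels)
}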
|
49
srcs/common/output_colors.go
Normal file
|
@ -0,0 +1,49 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"github.com/fatih/color"
|
||||||
|
"log"
|
||||||
|
)
|
||||||
|
|
||||||
|
// PrintHeader1 prints a big header formatted string on stdout.
|
||||||
|
func PrintHeader1(v ...interface{}) {
|
||||||
|
header := color.New(color.FgBlue, color.Bold, color.Underline).SprintFunc()
|
||||||
|
fmt.Printf("%v\n", header(v))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrintHeader2 prints a small header formatted string on stdout.
|
||||||
|
func PrintHeader2(v ...interface{}) {
|
||||||
|
magenta := color.New(color.FgMagenta).SprintFunc()
|
||||||
|
fmt.Printf("%v\n", magenta(v))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrintWarning prints a warning formatted string on stdout.
|
||||||
|
func PrintWarning(v ...interface{}) {
|
||||||
|
yellow := color.New(color.FgYellow, color.Bold).SprintFunc()
|
||||||
|
fmt.Printf("[%s] %v\n", yellow("WARNING"), v)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrintOk prints a success formatted string on stdout.
|
||||||
|
func PrintOk(v ...interface{}) {
|
||||||
|
green := color.New(color.FgGreen, color.Bold).SprintFunc()
|
||||||
|
fmt.Printf("[%s] %v\n", green("SUCCESS"), v)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrintInfo prints an info formatted string on stdout.
|
||||||
|
func PrintInfo(v ...interface{}) {
|
||||||
|
blue := color.New(color.FgBlue, color.Bold).SprintFunc()
|
||||||
|
fmt.Printf("[%s] %v\n", blue("INFO"), v)
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrintErr prints an error formatted string on stdout and exits the app.
|
||||||
|
func PrintErr(v ...interface{}) {
|
||||||
|
red := color.New(color.FgRed).SprintFunc()
|
||||||
|
log.Fatalf("[%s] %v\n", red("ERROR"), v)
|
||||||
|
}
|
257
srcs/common/process.go
Normal file
|
@ -0,0 +1,257 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"context"
|
||||||
|
"errors"
|
||||||
|
"io"
|
||||||
|
"io/ioutil"
|
||||||
|
"os"
|
||||||
|
"os/exec"
|
||||||
|
"path/filepath"
|
||||||
|
"regexp"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
"syscall"
|
||||||
|
"time"
|
||||||
|
"unicode"
|
||||||
|
)
|
||||||
|
|
||||||
|
const TIMEOUT = 5 //5 secs
|
||||||
|
|
||||||
|
// ExecutePipeCommand executes a piped command.
|
||||||
|
//
|
||||||
|
// It returns a string which represents stdout and an error if any, otherwise
|
||||||
|
// it returns nil.
|
||||||
|
func ExecutePipeCommand(command string) (string, error) {
|
||||||
|
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), TIMEOUT*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cmd := exec.CommandContext(ctx, "/bin/bash", "-c", command)
|
||||||
|
out, err := cmd.Output()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
|
||||||
|
if ctx.Err() == context.DeadlineExceeded {
|
||||||
|
return string(out), errors.New("Time out during with: " + command)
|
||||||
|
}
|
||||||
|
|
||||||
|
return string(out), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ExecuteRunCmd runs a command and displays the output to stdout and stderr.
|
||||||
|
//
|
||||||
|
// It returns two pointers of string which are respectively stdout and stderr
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func ExecuteRunCmd(name, dir string, v bool, args ...string) (*string, *string,
|
||||||
|
error) {
|
||||||
|
|
||||||
|
cmd := exec.Command(name, args...)
|
||||||
|
cmd.Dir = dir
|
||||||
|
bufOut, bufErr := &bytes.Buffer{}, &bytes.Buffer{}
|
||||||
|
if v {
|
||||||
|
cmd.Stderr = io.MultiWriter(bufErr, os.Stderr)
|
||||||
|
cmd.Stdout = io.MultiWriter(bufOut, os.Stdout)
|
||||||
|
} else {
|
||||||
|
cmd.Stderr = bufErr
|
||||||
|
cmd.Stdout = bufOut
|
||||||
|
}
|
||||||
|
cmd.Stdin = os.Stdin
|
||||||
|
_ = cmd.Run()
|
||||||
|
|
||||||
|
strOut, strErr := bufOut.String(), bufErr.String()
|
||||||
|
|
||||||
|
return &strOut, &strErr, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ExecuteRunCmdStdin runs a command and saves stdout and stderr as bytes.
|
||||||
|
//
|
||||||
|
// It returns two byte arrays which are respectively stdout and stderr
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func ExecuteRunCmdStdin(name string, stdinArgs []byte, args ...string) ([]byte,
|
||||||
|
[]byte) {
|
||||||
|
|
||||||
|
bufOut, bufErr := &bytes.Buffer{}, &bytes.Buffer{}
|
||||||
|
|
||||||
|
var buffer bytes.Buffer
|
||||||
|
if len(stdinArgs) > 0 {
|
||||||
|
buffer = bytes.Buffer{}
|
||||||
|
buffer.Write(stdinArgs)
|
||||||
|
}
|
||||||
|
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), TIMEOUT*time.Second)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
cmd := exec.CommandContext(ctx, name, args...)
|
||||||
|
if len(stdinArgs) > 0 {
|
||||||
|
cmd.Stdin = &buffer
|
||||||
|
}
|
||||||
|
cmd.Stdout = bufOut
|
||||||
|
cmd.Stderr = bufErr
|
||||||
|
|
||||||
|
_ = cmd.Run()
|
||||||
|
|
||||||
|
if ctx.Err() == context.DeadlineExceeded {
|
||||||
|
PrintWarning("Time out during executing: " + cmd.String())
|
||||||
|
return bufOut.Bytes(), bufErr.Bytes()
|
||||||
|
}
|
||||||
|
|
||||||
|
return bufOut.Bytes(), bufErr.Bytes()
|
||||||
|
}
|
||||||
|
|
||||||
|
// ExecuteCommand executes a single command without displaying the output.
|
||||||
|
//
|
||||||
|
// It returns a string which represents stdout and an error if any, otherwise
|
||||||
|
// it returns nil.
|
||||||
|
func ExecuteCommand(command string, arguments []string) (string, error) {
|
||||||
|
out, err := exec.Command(command, arguments...).CombinedOutput()
|
||||||
|
if err != nil {
|
||||||
|
return "", err
|
||||||
|
}
|
||||||
|
return string(out), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ExecuteWaitCommand runs a command and waits for its termination without
|
||||||
|
// displaying the output.
|
||||||
|
//
|
||||||
|
// It returns a string which represents stdout and an error if any, otherwise
|
||||||
|
// it returns nil.
|
||||||
|
func ExecuteWaitCommand(dir, command string, args ...string) (*string, *string,
|
||||||
|
error) {
|
||||||
|
|
||||||
|
cmd := exec.Command(command, args...)
|
||||||
|
cmd.Dir = dir
|
||||||
|
cmd.SysProcAttr = &syscall.SysProcAttr{Setpgid: true}
|
||||||
|
|
||||||
|
bufOut, bufErr := &bytes.Buffer{}, &bytes.Buffer{}
|
||||||
|
cmd.Stdout = io.MultiWriter(bufOut) // Add os.Stdout here to also echo on stdout
|
||||||
|
cmd.Stderr = io.MultiWriter(bufErr) // Add os.Stderr here to also echo on stderr
|
||||||
|
cmd.Stdin = os.Stdin
|
||||||
|
|
||||||
|
if err := cmd.Start(); err != nil {
|
||||||
|
return nil, nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
PrintInfo("Waiting command: " + command + " " + strings.Join(args, " "))
|
||||||
|
|
||||||
|
// Ignore error
|
||||||
|
_ = cmd.Wait()
|
||||||
|
|
||||||
|
strOut, strErr := bufOut.String(), bufErr.String()
|
||||||
|
|
||||||
|
return &strOut, &strErr, nil
|
||||||
|
}
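A short usage sketch, with an arbitrary directory and command chosen only for illustration:

package main

import (
	"fmt"

	u "tools/srcs/common"
)

func main() {
	// Run "ls -l" in /tmp and inspect the captured output once it terminates.
	strOut, strErr, err := u.ExecuteWaitCommand("/tmp", "ls", "-l")
	if err != nil {
		u.PrintErr(err)
	}
	fmt.Println("stdout:", *strOut)
	fmt.Println("stderr:", *strErr)
}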
|
||||||
|
|
||||||
|
// PKill kills a given running process with a particular signal
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func PKill(programName string, sig syscall.Signal) error {
|
||||||
|
if len(programName) == 0 {
|
||||||
|
return errors.New("program name should not be empty")
|
||||||
|
}
|
||||||
|
re, err := regexp.Compile(programName)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
pids := getPids(re)
|
||||||
|
if len(pids) == 0 {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
current := os.Getpid()
|
||||||
|
for _, pid := range pids {
|
||||||
|
if current != pid {
|
||||||
|
_ = syscall.Kill(pid, sig)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// PidOf gets PIDs of a particular process.
|
||||||
|
//
|
||||||
|
// It returns a list of integer which represents the pids of particular process
|
||||||
|
// and an error if any, otherwise it returns nil.
|
||||||
|
func PidOf(name string) ([]int, error) {
|
||||||
|
if len(name) == 0 {
|
||||||
|
return []int{}, errors.New("name should not be empty")
|
||||||
|
}
|
||||||
|
re, err := regexp.Compile("(^|/)" + name + "$")
|
||||||
|
if err != nil {
|
||||||
|
return []int{}, err
|
||||||
|
}
|
||||||
|
return getPids(re), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// getPids gets PIDs of a particular process.
|
||||||
|
//
|
||||||
|
// It returns a list of integer which represents the pids of particular process.
|
||||||
|
func getPids(re *regexp.Regexp) []int {
|
||||||
|
var pids []int
|
||||||
|
|
||||||
|
dirFD, err := os.Open("/proc")
|
||||||
|
if err != nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
defer dirFD.Close()
|
||||||
|
|
||||||
|
for {
|
||||||
|
// Read a small number of entries at a time in case there are many; we don't want to
|
||||||
|
// allocate a lot here.
|
||||||
|
ls, err := dirFD.Readdir(10)
|
||||||
|
if err == io.EOF {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, entry := range ls {
|
||||||
|
if !entry.IsDir() {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
// If the directory is not a number (i.e. not a PID), skip it
|
||||||
|
pid, err := strconv.Atoi(entry.Name())
|
||||||
|
if err != nil {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
cmdline, err := ioutil.ReadFile(filepath.Join("/proc", entry.Name(), "cmdline"))
|
||||||
|
if err != nil {
|
||||||
|
println("Error reading file %s: %+v", filepath.Join("/proc",
|
||||||
|
entry.Name(), "cmdline"), err)
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
// The bytes we read have '\0' as a separator for the command line
|
||||||
|
parts := bytes.SplitN(cmdline, []byte{0}, 2)
|
||||||
|
if len(parts) == 0 {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
// Split the command line itself; we are interested in just the first part
|
||||||
|
exe := strings.FieldsFunc(string(parts[0]), func(c rune) bool {
|
||||||
|
return unicode.IsSpace(c) || c == ':'
|
||||||
|
})
|
||||||
|
if len(exe) == 0 {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
// Check if the name of the executable is what we are looking for
|
||||||
|
if re.MatchString(exe[0]) {
|
||||||
|
// Grab the PID from the directory path
|
||||||
|
pids = append(pids, pid)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return pids
|
||||||
|
}
|
108
srcs/common/ukconfig.go
Normal file
|
@ -0,0 +1,108 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
const CONFIGUK = "Config.uk"
|
||||||
|
const MENUCONFIG = "menuconfig"
|
||||||
|
const CONFIG = "config"
|
||||||
|
const SELECT = "select"
|
||||||
|
|
||||||
|
// processConfigUK processes each line of a "Config.uk" file.
|
||||||
|
//
|
||||||
|
// It fills mapConfig (and mapLabel when provided) with the parsed dependencies.
|
||||||
|
func ProcessConfigUK(lines []string, fullSelect bool,
|
||||||
|
mapConfig map[string][]string, mapLabel map[string]string) {
|
||||||
|
|
||||||
|
var libName string
|
||||||
|
var otherConfig = false
|
||||||
|
|
||||||
|
for i, line := range lines {
|
||||||
|
parseConfigUK(i, line, &libName, fullSelect, &otherConfig,
|
||||||
|
mapConfig, mapLabel)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseConfigUK parses a single line to detect selected libraries (dependencies).
|
||||||
|
//
|
||||||
|
// Selected libraries are recorded in mapConfig; conditional selections also get a label in mapLabel.
|
||||||
|
func parseConfigUK(index int, line string, libName *string, fullSelect bool,
|
||||||
|
otherConfig *bool, mapConfig map[string][]string, mapLabel map[string]string) {
|
||||||
|
|
||||||
|
space := regexp.MustCompile(`\s+|\t+`)
|
||||||
|
line = space.ReplaceAllString(line, " ")
|
||||||
|
line = strings.TrimSpace(line)
|
||||||
|
|
||||||
|
switch {
|
||||||
|
case strings.Contains(line, MENUCONFIG),
|
||||||
|
strings.Contains(line, CONFIG) && index == 0:
|
||||||
|
{
|
||||||
|
// First case: get the name of the lib
|
||||||
|
|
||||||
|
// Split the line to retrieve the name of the lib
|
||||||
|
split := strings.Split(line, " ")
|
||||||
|
if len(split) < 2 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
*libName = strings.TrimSuffix(split[1], "\n")
|
||||||
|
}
|
||||||
|
case strings.Contains(line, CONFIG) && index > 0:
|
||||||
|
{
|
||||||
|
// Second case: check if other Config lines
|
||||||
|
*otherConfig = true
|
||||||
|
}
|
||||||
|
case strings.Contains(line, SELECT) && index > 0:
|
||||||
|
{
|
||||||
|
// Third case: add select libs
|
||||||
|
|
||||||
|
// if there are other Config flag, check the dependencies if
|
||||||
|
// specified (fullDep), otherwise break
|
||||||
|
if !*otherConfig && !fullSelect {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
// Split the line to retrieve the name of the dependency
|
||||||
|
split := strings.Split(line, " ")
|
||||||
|
var library string
|
||||||
|
if len(split) < 2 {
|
||||||
|
break
|
||||||
|
} else if len(split) > 2 {
|
||||||
|
// If we have complex select (e.g., select LIBUKTIME if
|
||||||
|
// !HAVE_LIBC && ARCH_X86_64)
|
||||||
|
var re = regexp.MustCompile(`(?m)select\s(\w*)\sif\s([a-zA-Z0-9!_& ]*)`)
|
||||||
|
match := re.FindAllStringSubmatch(line, -1)
|
||||||
|
if len(match) > 0 {
|
||||||
|
library = match[0][1]
|
||||||
|
if mapLabel != nil {
|
||||||
|
mapLabel[library] = match[0][2]
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
library = split[1]
|
||||||
|
}
|
||||||
|
|
||||||
|
// Current selected library
|
||||||
|
selectedLib := strings.TrimSuffix(library, "\n")
|
||||||
|
|
||||||
|
// Links between current lib and its dependencies
|
||||||
|
mapConfig[*libName] = append(mapConfig[*libName], selectedLib)
|
||||||
|
|
||||||
|
// Add selected lib in the map in order to generate a node
|
||||||
|
// if it does not exist
|
||||||
|
if _, ok := mapConfig[selectedLib]; !ok {
|
||||||
|
mapConfig[selectedLib] = nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
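The parsing rules above can be illustrated on a hand-written Config.uk fragment (the library names are invented); note that fullSelect must be true here because the fragment has no further 'config' lines:

package main

import (
	"fmt"

	u "tools/srcs/common"
)

func main() {
	// Hand-written Config.uk fragment.
	lines := []string{
		"menuconfig LIBFOO",
		"\tselect LIBUKALLOC",
		"\tselect LIBUKTIME if !HAVE_LIBC",
	}

	mapConfig := make(map[string][]string)
	mapLabel := make(map[string]string)
	u.ProcessConfigUK(lines, true, mapConfig, mapLabel)

	fmt.Println(mapConfig["LIBFOO"]) // [LIBUKALLOC LIBUKTIME]
	fmt.Println(mapLabel)            // map[LIBUKTIME:!HAVE_LIBC]
}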
|
12
srcs/common/unikraft_path.go
Normal file
|
@ -0,0 +1,12 @@
|
||||||
|
package common
|
||||||
|
|
||||||
|
// Exported constants for folder management
|
||||||
|
const (
|
||||||
|
APPSFOLDER = "apps" + SEP
|
||||||
|
UNIKRAFTFOLDER = "unikraft" + SEP
|
||||||
|
BUILDFOLDER = "build" + SEP
|
||||||
|
LIBSFOLDER = "libs" + SEP
|
||||||
|
INCLUDEFOLDER = "include" + SEP
|
||||||
|
|
||||||
|
KVM_IMAGE = "_kvm-x86_64"
|
||||||
|
)
|
199
srcs/common/utils.go
Normal file
|
@ -0,0 +1,199 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package common
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"io/ioutil"
|
||||||
|
"net/http"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"reflect"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Contains checks if a given slice contains a particular string.
|
||||||
|
//
|
||||||
|
// It returns true if the given slice contains the searched string.
|
||||||
|
func Contains(s []string, e string) bool {
|
||||||
|
for _, a := range s {
|
||||||
|
if a == e {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// DownloadFile downloads a file from a URL and reads its content.
|
||||||
|
//
|
||||||
|
// It returns a pointer to a string that represents the file content and an
|
||||||
|
// error if any, otherwise it returns nil.
|
||||||
|
func DownloadFile(url string) (*string, error) {
|
||||||
|
|
||||||
|
resp, err := http.Get(url)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer resp.Body.Close()
|
||||||
|
|
||||||
|
var bodyString string
|
||||||
|
if resp.StatusCode == http.StatusOK {
|
||||||
|
bodyBytes, err := ioutil.ReadAll(resp.Body)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
bodyString = string(bodyBytes)
|
||||||
|
}
|
||||||
|
return &bodyString, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetProgramPath returns the absolute path of a given program.
|
||||||
|
//
|
||||||
|
// It returns a string that represents the absolute path of a program and an
|
||||||
|
// error if any, otherwise it returns nil.
|
||||||
|
func GetProgramPath(programName *string) (string, error) {
|
||||||
|
var programPath string
|
||||||
|
if ok, err := Exists(*programName); err != nil {
|
||||||
|
return programPath, err
|
||||||
|
} else if ok {
|
||||||
|
// Program (binary) is installed
|
||||||
|
if filepath.IsAbs(*programName) {
|
||||||
|
programPath = *programName
|
||||||
|
*programName = filepath.Base(programPath)
|
||||||
|
} else if programPath, err = filepath.Abs(*programName); err != nil {
|
||||||
|
return programPath, err
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Run 'which' command to determine if program has a symbolic name
|
||||||
|
out, err := ExecuteCommand("which", []string{*programName})
|
||||||
|
if err != nil {
|
||||||
|
return programPath, err
|
||||||
|
} else {
|
||||||
|
// Check if out is a valid path
|
||||||
|
if _, err := os.Stat(out); err == nil {
|
||||||
|
PrintWarning("Unknown Program -> will skip gathering" +
|
||||||
|
" symbols, system calls and shared libs process")
|
||||||
|
} else {
|
||||||
|
programPath = strings.TrimSpace(out)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return programPath, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------------Record Data---------------------------------
|
||||||
|
|
||||||
|
// RecordDataTxt saves data into a text file named by filename.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func RecordDataTxt(filename string, headers []string, data interface{}) error {
|
||||||
|
file, err := os.Create(filename)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
defer file.Close()
|
||||||
|
|
||||||
|
v := reflect.ValueOf(data)
|
||||||
|
values := make([]interface{}, v.NumField())
|
||||||
|
|
||||||
|
for i := 0; i < v.NumField(); i++ {
|
||||||
|
values[i] = v.Field(i).Interface()
|
||||||
|
if err := WriteMapToFile(file, headers[i], values[i]); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// WriteMapToFile writes a map into an already opened text file.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func WriteMapToFile(file *os.File, headerName string, in interface{}) error {
|
||||||
|
header := "----------------------------------------------\n" +
|
||||||
|
headerName + "\n----------------------------------------------\n"
|
||||||
|
|
||||||
|
if _, err := file.WriteString(header); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
switch v := in.(type) {
|
||||||
|
case map[string]string:
|
||||||
|
for key, value := range v {
|
||||||
|
|
||||||
|
var str string
|
||||||
|
if len(value) > 0 {
|
||||||
|
str = key + "@" + value
|
||||||
|
} else {
|
||||||
|
str = key
|
||||||
|
}
|
||||||
|
|
||||||
|
if _, err := file.WriteString(str + "\n"); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
case map[string][]string:
|
||||||
|
for key, values := range v {
|
||||||
|
|
||||||
|
var str string
|
||||||
|
if len(values) > 0 {
|
||||||
|
str = key + "->" + strings.Join(values, ",")
|
||||||
|
} else {
|
||||||
|
str = key
|
||||||
|
}
|
||||||
|
|
||||||
|
if _, err := file.WriteString(str + "\n"); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
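A sketch of how these two helpers are meant to be combined, with invented analysis results; each map ends up under its own header, using the 'key->a,b' and 'key@value' layouts shown above:

package main

import u "tools/srcs/common"

func main() {
	// Invented static-analysis results, just to show the resulting text layout.
	data := u.StaticData{
		Dependencies: map[string][]string{"libc.so.6": {"ld-linux-x86-64.so.2"}},
		SharedLibs:   map[string][]string{"libpthread.so.0": nil},
		SystemCalls:  map[string]string{"write": "1"},
		Symbols:      map[string]string{"printf": ""},
	}
	headers := []string{"Dependencies", "SharedLibs", "SystemCalls", "Symbols"}

	if err := u.RecordDataTxt("demo.txt", headers, data); err != nil {
		u.PrintErr(err)
	}
}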
|
||||||
|
|
||||||
|
// -------------------------------------JSON------------------------------------
|
||||||
|
|
||||||
|
// RecordDataJson saves json into a json file named by filename.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func RecordDataJson(filename string, data *Data) error {
|
||||||
|
|
||||||
|
b, err := json.Marshal(data)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
if err = ioutil.WriteFile(filename+".json", b, os.ModePerm); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ReadDataJson loads json from a json file named by filename.
|
||||||
|
//
|
||||||
|
// It returns a Data structure initialized and an error if any, otherwise it
|
||||||
|
// returns nil.
|
||||||
|
func ReadDataJson(filename string, data *Data) (*Data, error) {
|
||||||
|
|
||||||
|
jsonFile, err := os.Open(filename + ".json")
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer jsonFile.Close()
|
||||||
|
|
||||||
|
byteValue, err := ioutil.ReadAll(jsonFile)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if err = json.Unmarshal(byteValue, &data); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return data, nil
|
||||||
|
}
|
41
srcs/crawlertool/args.go
Normal file
|
@ -0,0 +1,41 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package crawlertool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"github.com/akamensky/argparse"
|
||||||
|
"os"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
fullLibsArg = "full"
|
||||||
|
outputArg = "output"
|
||||||
|
libsArg = "libraries"
|
||||||
|
repoArg = "repository"
|
||||||
|
)
|
||||||
|
|
||||||
|
// parseLocalArguments parses the arguments of the crawler tool.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func parseLocalArguments(p *argparse.Parser, args *u.Arguments) error {
|
||||||
|
|
||||||
|
args.InitArgParse(p, args, u.BOOL, "f", fullLibsArg,
|
||||||
|
&argparse.Options{Required: false, Default: false,
|
||||||
|
Help: "Take all the selected libraries"})
|
||||||
|
|
||||||
|
args.InitArgParse(p, args, u.STRING, "o", outputArg,
|
||||||
|
&argparse.Options{Required: true, Help: "Output folder that will " +
|
||||||
|
"contain dependencies graph and images"})
|
||||||
|
args.InitArgParse(p, args, u.STRING, "l", libsArg,
|
||||||
|
&argparse.Options{Required: false, Help: "Path of the file that " +
|
||||||
|
"contains libs"})
|
||||||
|
args.InitArgParse(p, args, u.STRING, "r", repoArg,
|
||||||
|
&argparse.Options{Required: false, Help: "Path of the repository"})
|
||||||
|
|
||||||
|
return u.ParserWrapper(p, os.Args)
|
||||||
|
}
|
111
srcs/crawlertool/run_crawlertool.go
Normal file
|
@ -0,0 +1,111 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package crawlertool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"strings"
|
||||||
|
"time"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// RunCrawler runs the crawler analyser tool (which is outside the
|
||||||
|
// UNICORE toolchain).
|
||||||
|
func RunCrawler() {
|
||||||
|
|
||||||
|
mapLabel := make(map[string]string)
|
||||||
|
mapConfig := make(map[string][]string)
|
||||||
|
|
||||||
|
// Init and parse local arguments
|
||||||
|
args := new(u.Arguments)
|
||||||
|
p, err := args.InitArguments()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
if err := parseLocalArguments(p, args); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Used to select all libraries (even those below another Config fields)
|
||||||
|
fullSelect := *args.BoolArg[fullLibsArg]
|
||||||
|
|
||||||
|
var path string
|
||||||
|
if len(*args.StringArg[repoArg]) > 0 {
|
||||||
|
// Only one folder
|
||||||
|
path = *args.StringArg[repoArg]
|
||||||
|
u.PrintInfo("Parse folder: " + path)
|
||||||
|
if err := searchConfigUK(path, fullSelect, mapConfig, mapLabel); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
} else if len(*args.StringArg[libsArg]) > 0 {
|
||||||
|
|
||||||
|
// Several folders within a list
|
||||||
|
lines, err := u.ReadLinesFile(*args.StringArg[libsArg])
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Process Config.uk of each process
|
||||||
|
for _, line := range lines {
|
||||||
|
path = strings.TrimSuffix(line, "\n")
|
||||||
|
u.PrintInfo("Parse folder: " + path)
|
||||||
|
if err := searchConfigUK(path, fullSelect, mapConfig, mapLabel); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
u.PrintErr("You must specify either -r (--repository) or -l (libs)")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate the out folder
|
||||||
|
outFolder := *args.StringArg[outputArg]
|
||||||
|
if outFolder[len(outFolder)-1:] != string(os.PathSeparator) {
|
||||||
|
outFolder += string(os.PathSeparator)
|
||||||
|
}
|
||||||
|
|
||||||
|
outputPath := outFolder +
|
||||||
|
"output_" + time.Now().Format("20060102150405")
|
||||||
|
|
||||||
|
// Create the dependencies graph
|
||||||
|
u.GenerateGraph("Unikraft Crawler", outputPath, mapConfig,
|
||||||
|
mapLabel)
|
||||||
|
|
||||||
|
u.PrintOk(".dot file is saved: " + outputPath)
|
||||||
|
}
|
||||||
|
|
||||||
|
// searchConfigUK performs a look-up to find "Config.uk" files.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func searchConfigUK(path string, fullSelect bool,
|
||||||
|
mapConfig map[string][]string, mapLabel map[string]string) error {
|
||||||
|
|
||||||
|
err := filepath.Walk(path, func(path string, info os.FileInfo,
|
||||||
|
err error) error {
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Consider only CONFIGUK files
|
||||||
|
if !info.IsDir() && info.Name() == u.CONFIGUK {
|
||||||
|
lines, err := u.ReadLinesFile(path)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
u.ProcessConfigUK(lines, fullSelect, mapConfig, mapLabel)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
50
srcs/dependtool/args.go
Normal file
@@ -0,0 +1,50 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file
//
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>

package dependtool

import (
    "github.com/akamensky/argparse"
    "os"
    u "tools/srcs/common"
)

const (
    programArg    = "program"
    testFileArg   = "testFile"
    configFileArg = "configFile"
    optionsArg    = "options"
    waitTimeArg   = "waitTime"
    saveOutputArg = "saveOutput"
    fullDepsArg   = "fullDeps"
)

// parseLocalArguments parses the arguments of the application.
func parseLocalArguments(p *argparse.Parser, args *u.Arguments) error {

    args.InitArgParse(p, args, u.STRING, "p", programArg,
        &argparse.Options{Required: true, Help: "Program name"})
    args.InitArgParse(p, args, u.STRING, "t", testFileArg,
        &argparse.Options{Required: false, Help: "Path of the test file"})
    args.InitArgParse(p, args, u.STRING, "c", configFileArg,
        &argparse.Options{Required: false, Help: "Path of the config file"})
    args.InitArgParse(p, args, u.STRING, "o", optionsArg,
        &argparse.Options{Required: false, Default: "", Help: "Extra options for " +
            "launching program"})

    args.InitArgParse(p, args, u.INT, "w", waitTimeArg,
        &argparse.Options{Required: false, Default: 60, Help: "Wait time (" +
            "sec) for external tests (default: 60 sec)"})

    args.InitArgParse(p, args, u.BOOL, "", saveOutputArg,
        &argparse.Options{Required: false, Default: false,
            Help: "Save results as TXT file and graphs as PNG file"})
    args.InitArgParse(p, args, u.BOOL, "", fullDepsArg,
        &argparse.Options{Required: false, Default: false,
            Help: "Show dependencies of dependencies"})

    return u.ParserWrapper(p, os.Args)
}
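For readers unfamiliar with the argparse wrapper used above, the stand-alone sketch below shows how the same Options fields (Required, Default, Help) behave when the github.com/akamensky/argparse package (imported above) is used directly. It assumes u.InitArgParse simply forwards to that library; the parser name "dependtool-demo" and the printed output are purely illustrative.

package main

import (
    "fmt"
    "os"

    "github.com/akamensky/argparse"
)

func main() {
    // Hypothetical stand-alone parser mirroring the Options used above.
    parser := argparse.NewParser("dependtool-demo", "illustrative flag handling")
    program := parser.String("p", "program",
        &argparse.Options{Required: true, Help: "Program name"})
    fullDeps := parser.Flag("", "fullDeps",
        &argparse.Options{Required: false, Default: false,
            Help: "Show dependencies of dependencies"})

    if err := parser.Parse(os.Args); err != nil {
        // Missing required flags end up here, e.g. when -p is omitted.
        fmt.Fprint(os.Stderr, parser.Usage(err))
        os.Exit(1)
    }
    fmt.Println("program:", *program, "fullDeps:", *fullDeps)
}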
97
srcs/dependtool/binary_file.go
Normal file
@@ -0,0 +1,97 @@
// Copyright 2019 The UNICORE Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file
//
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>

package dependtool

import (
    "debug/elf"
    "errors"
    "os"
)

// getMachOS reads and decodes a Mach-O file.
//
// It returns an error if any, otherwise it returns nil.
func getMachOS(filename string) error {
    f, err := os.Open(filename)
    if err != nil {
        return err
    }
    defer f.Close()

    // Read and decode the Mach-O identifier (magic number)
    var ident [16]uint8
    if _, err = f.ReadAt(ident[0:], 0); err != nil {
        return err
    }

    // Accept the known Mach-O magic numbers (fat, 64-bit little and big endian)
    if ident[0] == '\xca' && ident[1] == '\xfe' && ident[2] == '\xba' && ident[3] == '\xbe' {
        return nil
    } else if ident[0] == '\xcf' && ident[1] == '\xfa' && ident[2] == '\xed' && ident[3] == '\xfe' {
        return nil
    } else if ident[0] == '\xfe' && ident[1] == '\xed' && ident[2] == '\xfa' && ident[3] == '\xcf' {
        return nil
    }

    return errors.New("not compatible machos format")
}

// getElf reads and decodes an ELF file.
//
// It returns a pointer to an ELF file and an error if any, otherwise it
// returns nil.
func getElf(filename string) (*elf.File, error) {
    f, err := os.Open(filename)
    if err != nil {
        return nil, err
    }
    defer f.Close()

    _elf, err := elf.NewFile(f)
    if err != nil {
        return nil, err
    }

    // Read and decode ELF identifier
    var ident [16]uint8
    _, err = f.ReadAt(ident[0:], 0)
    if err != nil {
        return nil, err
    }

    // Check the type
    if ident[0] != '\x7f' || ident[1] != 'E' || ident[2] != 'L' || ident[3] != 'F' {
        return nil, nil
    }

    return _elf, nil
}

// GetElfArchitecture gets the ELF architecture.
//
// It returns a string that defines the ELF class and a string that defines the
// Machine type.
func GetElfArchitecture(elf *elf.File) (string, string) {
    var arch, mach string

    switch elf.Class.String() {
    case "ELFCLASS64":
        arch = "64 bits"
    case "ELFCLASS32":
        arch = "32 bits"
    }

    switch elf.Machine.String() {
    case "EM_AARCH64":
        mach = "ARM64"
    case "EM_386":
        mach = "x86"
    case "EM_X86_64":
        mach = "x86_64"
    }

    return arch, mach
}
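As a point of comparison, the snippet below is a minimal, self-contained sketch (not part of the tool) that uses the standard debug/elf package directly to print the same class, machine and entry-point information that checkElf later reports through GetElfArchitecture. The path /bin/ls is only an example binary.

package main

import (
    "debug/elf"
    "fmt"
    "log"
)

func main() {
    // Open an ELF binary directly with the standard library.
    f, err := elf.Open("/bin/ls") // example path
    if err != nil {
        log.Fatal(err) // not an ELF file or cannot be read
    }
    defer f.Close()

    // Class (32/64 bit), machine type and entry point, as printed by checkElf.
    fmt.Println("ELF Class:", f.Class.String())
    fmt.Println("Machine:  ", f.Machine.String())
    fmt.Printf("Entry Point: %#x\n", f.Entry)
}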
225
srcs/dependtool/dynamic_analyser.go
Normal file
@@ -0,0 +1,225 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"encoding/json"
|
||||||
|
"errors"
|
||||||
|
"io/ioutil"
|
||||||
|
"os"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
"syscall"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Exported struct that represents the arguments for dynamic analysis.
|
||||||
|
type DynamicArgs struct {
|
||||||
|
waitTime int
|
||||||
|
fullDeps, saveOutput bool
|
||||||
|
testFile string
|
||||||
|
options []string
|
||||||
|
}
|
||||||
|
|
||||||
|
const (
|
||||||
|
systrace = "strace"
|
||||||
|
libtrace = "ltrace"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ---------------------------------Read Json-----------------------------------
|
||||||
|
|
||||||
|
// readTestFileJson loads a Testing structure from the JSON file named by filename.
|
||||||
|
//
|
||||||
|
// It returns a Testing structure initialized and an error if any, otherwise it
|
||||||
|
// returns nil.
|
||||||
|
func readTestFileJson(filename string) (*Testing, error) {
|
||||||
|
|
||||||
|
testingStruct := &Testing{}
|
||||||
|
jsonFile, err := os.Open(filename)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
defer jsonFile.Close()
|
||||||
|
|
||||||
|
byteValue, err := ioutil.ReadAll(jsonFile)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if err = json.Unmarshal(byteValue, &testingStruct); err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
if testingStruct.ListCommands == nil || len(testingStruct.ListCommands) == 0 {
|
||||||
|
return nil, errors.New("ListCommands cannot be empty")
|
||||||
|
}
|
||||||
|
|
||||||
|
return testingStruct, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// --------------------------------Gather Data----------------------------------
|
||||||
|
|
||||||
|
// gatherDataAux gathers symbols and system calls of a given application (helper
|
||||||
|
// function).
|
||||||
|
//
|
||||||
|
// It returns true if a command must be run with sudo.
|
||||||
|
func gatherDataAux(command, programPath, programName, option string,
|
||||||
|
data *u.DynamicData, dArgs DynamicArgs) bool {
|
||||||
|
|
||||||
|
testingStruct := &Testing{}
|
||||||
|
if len(dArgs.testFile) > 0 {
|
||||||
|
var err error
|
||||||
|
testingStruct, err = readTestFileJson(dArgs.testFile)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintWarning("Cannot find test file: " + err.Error())
|
||||||
|
}
|
||||||
|
}
|
||||||
|
_, errStr := runCommandTester(programPath, programName, command, option,
|
||||||
|
testingStruct, dArgs, data)
|
||||||
|
|
||||||
|
ret := false
|
||||||
|
if command == systrace {
|
||||||
|
ret = parseTrace(errStr, data.SystemCalls)
|
||||||
|
} else {
|
||||||
|
ret = parseTrace(errStr, data.Symbols)
|
||||||
|
}
|
||||||
|
return ret
|
||||||
|
}
|
||||||
|
|
||||||
|
// gatherData gathers symbols and system calls of a given application.
|
||||||
|
//
|
||||||
|
func gatherData(command, programPath, programName string,
|
||||||
|
data *u.DynamicData, dArgs DynamicArgs) {
|
||||||
|
|
||||||
|
if len(dArgs.options) > 0 {
|
||||||
|
// Iterate through configs present in config file
|
||||||
|
for _, option := range dArgs.options {
|
||||||
|
// Check if program name is used in config file
|
||||||
|
if strings.Contains(option, programName) {
|
||||||
|
// If yes, take only arguments
|
||||||
|
split := strings.Split(option, programName)
|
||||||
|
option = strings.TrimSuffix(strings.Replace(split[1],
|
||||||
|
" ", "", -1), "\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
u.PrintInfo("Run " + programName + " with option: '" +
|
||||||
|
option + "'")
|
||||||
|
if requireSudo := gatherDataAux(command, programPath, programName,
|
||||||
|
option, data, dArgs); requireSudo {
|
||||||
|
u.PrintErr(programName + " requires superuser " +
|
||||||
|
"privileges: Run command with sudo")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Run without option/config
|
||||||
|
if requireSudo := gatherDataAux(command, programPath, programName,
|
||||||
|
"", data, dArgs); requireSudo {
|
||||||
|
u.PrintErr(programName + " requires superuser " +
|
||||||
|
"privileges: Run command with sudo")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// gatherDynamicSharedLibs gathers shared libraries of a given application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func gatherDynamicSharedLibs(programName string, pid int, data *u.DynamicData,
|
||||||
|
fullDeps bool) error {
|
||||||
|
|
||||||
|
// Get the pid
|
||||||
|
pidStr := strconv.Itoa(pid)
|
||||||
|
u.PrintInfo("PID '" + programName + "' : " + pidStr)
|
||||||
|
|
||||||
|
// Use 'lsof' to get open files and thus .so files
|
||||||
|
if output, err := u.ExecutePipeCommand(
|
||||||
|
"lsof -p " + pidStr + " | uniq | awk '{print $9}'"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
// Parse 'lsof' output
|
||||||
|
if err := parseLsof(output, data, fullDeps); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use 'cat /proc/pid' to get open files and thus .so files
|
||||||
|
if output, err := u.ExecutePipeCommand(
|
||||||
|
"cat /proc/" + pidStr + "/maps | awk '{print $6}' | " +
|
||||||
|
"grep '\\.so' | sort | uniq"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
// Parse 'cat' output
|
||||||
|
if err := parseLsof(output, data, fullDeps); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// ------------------------------------ARGS-------------------------------------
|
||||||
|
|
||||||
|
// getDArgs returns a DynamicArgs struct which contains arguments specific to
|
||||||
|
// the dynamic dependency analysis.
|
||||||
|
//
|
||||||
|
// It returns a DynamicArgs struct.
|
||||||
|
func getDArgs(args *u.Arguments, options []string) DynamicArgs {
|
||||||
|
return DynamicArgs{*args.IntArg[waitTimeArg],
|
||||||
|
*args.BoolArg[fullDepsArg], *args.BoolArg[saveOutputArg],
|
||||||
|
*args.StringArg[testFileArg], options}
|
||||||
|
}
|
||||||
|
|
||||||
|
// -------------------------------------RUN-------------------------------------
|
||||||
|
|
||||||
|
// dynamicAnalyser runs the dynamic analysis to get shared libraries,
|
||||||
|
// system calls and library calls of a given application.
|
||||||
|
//
|
||||||
|
func dynamicAnalyser(args *u.Arguments, data *u.Data, programPath string) {
|
||||||
|
|
||||||
|
// Check options
|
||||||
|
var configs []string
|
||||||
|
if len(*args.StringArg[configFileArg]) > 0 {
|
||||||
|
// Multi-lines options (config)
|
||||||
|
var err error
|
||||||
|
configs, err = u.ReadLinesFile(*args.StringArg[configFileArg])
|
||||||
|
if err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
} else if len(*args.StringArg[optionsArg]) > 0 {
|
||||||
|
// Single option (config)
|
||||||
|
configs = append(configs, *args.StringArg[optionsArg])
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get dynamic structure
|
||||||
|
dArgs := getDArgs(args, configs)
|
||||||
|
programName := *args.StringArg[programArg]
|
||||||
|
|
||||||
|
// Kill process if it is already launched
|
||||||
|
u.PrintInfo("Kill '" + programName + "' if it is already launched")
|
||||||
|
if err := u.PKill(programName, syscall.SIGINT); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Init dynamic data
|
||||||
|
dynamicData := &data.DynamicData
|
||||||
|
dynamicData.SharedLibs = make(map[string][]string)
|
||||||
|
dynamicData.SystemCalls = make(map[string]string)
|
||||||
|
dynamicData.Symbols = make(map[string]string)
|
||||||
|
|
||||||
|
// Run strace
|
||||||
|
u.PrintHeader2("(*) Gathering system calls from ELF file")
|
||||||
|
gatherData(systrace, programPath, programName, dynamicData, dArgs)
|
||||||
|
|
||||||
|
// Kill process if it is already launched
|
||||||
|
u.PrintInfo("Kill '" + programName + "' if it is already launched")
|
||||||
|
if err := u.PKill(programName, syscall.SIGINT); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run ltrace
|
||||||
|
u.PrintHeader2("(*) Gathering symbols from ELF file")
|
||||||
|
gatherData(libtrace, programPath, programName, dynamicData, dArgs)
|
||||||
|
}
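The test file passed with -t is decoded by readTestFileJson into the Testing structure defined in tester.go. The sketch below shows a plausible document and checks that it unmarshals; the field names come from the json tags visible later in this commit, while the concrete values (telnet on port 6379, two commands) are made up for illustration.

package main

import (
    "encoding/json"
    "fmt"
    "log"
)

// Same json tags as the Testing struct in tester.go.
type Testing struct {
    TypeTest      string   `json:"typeTest"`
    TimeOutTest   int      `json:"timeOutMsTest"`
    AddressTelnet string   `json:"addressTelnet"`
    PortTelnet    int      `json:"portTelnet"`
    TimeCommand   int32    `json:"timeMsCommand"`
    ListCommands  []string `json:"listCommands"`
}

func main() {
    // Hypothetical test file content for a telnet-style test.
    sample := []byte(`{
        "typeTest": "telnet",
        "timeOutMsTest": 3000,
        "addressTelnet": "127.0.0.1",
        "portTelnet": 6379,
        "timeMsCommand": 500,
        "listCommands": ["PING", "SET foo bar"]
    }`)

    var t Testing
    if err := json.Unmarshal(sample, &t); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%+v\n", t)
}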
314
srcs/dependtool/parser.go
Normal file
@@ -0,0 +1,314 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"regexp"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const levelDeps = 5
|
||||||
|
|
||||||
|
type recursiveData struct {
|
||||||
|
data, glMap, printDep map[string][]string
|
||||||
|
cmd, line string
|
||||||
|
listStr []string
|
||||||
|
level int
|
||||||
|
}
|
||||||
|
|
||||||
|
// --------------------------------Static Output--------------------------------
|
||||||
|
|
||||||
|
// parseReadELF parses the output of the 'readelf' command.
|
||||||
|
//
|
||||||
|
func parseReadELF(output string, data *u.StaticData) {
|
||||||
|
types := map[string]bool{"FUNC": true, "FILE": true, "OBJECT": true}
|
||||||
|
|
||||||
|
// Check the output of 'readelf' command
|
||||||
|
for _, line := range strings.Split(output, "\n") {
|
||||||
|
words := strings.Fields(line)
|
||||||
|
|
||||||
|
if len(words) > 8 && types[words[3]] {
|
||||||
|
symbol := strings.Split(words[7], "@")
|
||||||
|
data.Symbols[symbol[0]] = symbol[1]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseNMMac parses the output of the 'nm' command (macOS).
|
||||||
|
//
|
||||||
|
func parseNMMac(output string, data *u.StaticData) {
|
||||||
|
// Get the list of system calls
|
||||||
|
systemCalls := initSystemCalls()
|
||||||
|
|
||||||
|
// Check the output of 'nm' command
|
||||||
|
var re = regexp.MustCompile(`(?m)([U|T|B|D]\s)(.*)\s*`)
|
||||||
|
for _, match := range re.FindAllStringSubmatch(output, -1) {
|
||||||
|
if len(match) > 2 {
|
||||||
|
if match[2][0] == '_' {
|
||||||
|
match[2] = match[2][1:]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add to system calls map if symbol is a system call
|
||||||
|
if _, isSyscall := systemCalls[match[2]]; isSyscall {
|
||||||
|
data.SystemCalls[match[2]] = ""
|
||||||
|
} else {
|
||||||
|
data.Symbols[match[2]] = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseNMLinux parses the output of the 'nm' command (Linux).
|
||||||
|
//
|
||||||
|
func parseNMLinux(output string, data *u.StaticData) {
|
||||||
|
// Get the list of system calls
|
||||||
|
systemCalls := initSystemCalls()
|
||||||
|
|
||||||
|
// Check the output of 'nm' command
|
||||||
|
var re = regexp.MustCompile(`(?m)([U|T|B|D]\s)(.*)\s*`)
|
||||||
|
for _, match := range re.FindAllStringSubmatch(output, -1) {
|
||||||
|
// Add to system calls map if symbol is a system call
|
||||||
|
if _, isSyscall := systemCalls[match[2]]; isSyscall {
|
||||||
|
data.SystemCalls[match[2]] = ""
|
||||||
|
} else {
|
||||||
|
data.Symbols[match[2]] = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// parsePackagesName parses the output of the 'apt-cache pkgnames' command.
|
||||||
|
//
|
||||||
|
// It returns a string which represents the name of application used by the
|
||||||
|
// package manager (apt, ...).
|
||||||
|
func parsePackagesName(output string) string {
|
||||||
|
|
||||||
|
var i = 1
|
||||||
|
lines := strings.Split(output, "\n")
|
||||||
|
for _, line := range lines {
|
||||||
|
if len(line) > 0 {
|
||||||
|
fmt.Printf("%d) %s\n", i, line)
|
||||||
|
i++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
var input int
|
||||||
|
for true {
|
||||||
|
fmt.Print("Please enter your choice (0 to exit): ")
|
||||||
|
if _, err := fmt.Scanf("%d", &input); err != nil {
|
||||||
|
u.PrintWarning("Choice must be numeric! Try again")
|
||||||
|
} else {
|
||||||
|
if input == 0 {
|
||||||
|
u.PrintWarning("Abort dependencies analysis from apt-cache")
|
||||||
|
return ""
|
||||||
|
} else if (input >= 0) && (input <= i) {
|
||||||
|
return lines[input-1]
|
||||||
|
} else {
|
||||||
|
u.PrintWarning("Invalid input! Try again")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseDependencies parses the output of the 'apt-cache depends' command.
|
||||||
|
//
|
||||||
|
// It returns a slice of strings which represents all the dependencies of
|
||||||
|
// a particular package.
|
||||||
|
func parseDependencies(output string, data, dependenciesMap,
|
||||||
|
printDep map[string][]string, fullDeps bool, level int) []string {
|
||||||
|
listDep := make([]string, 0)
|
||||||
|
for _, line := range strings.Split(output, "\n") {
|
||||||
|
if len(line) > 0 && !strings.Contains(line, "<") {
|
||||||
|
|
||||||
|
if _, in := printDep[line]; !in {
|
||||||
|
fmt.Println(line)
|
||||||
|
printDep[line] = nil
|
||||||
|
}
|
||||||
|
|
||||||
|
if fullDeps && level < levelDeps {
|
||||||
|
rd := recursiveData{
|
||||||
|
data: data,
|
||||||
|
glMap: dependenciesMap,
|
||||||
|
printDep: printDep,
|
||||||
|
cmd: "apt-cache depends " + line +
|
||||||
|
" | awk '/Depends/ { print $2 }'",
|
||||||
|
line: line,
|
||||||
|
listStr: listDep,
|
||||||
|
level: level,
|
||||||
|
}
|
||||||
|
listDep = append(listDep, line)
|
||||||
|
parseRecursive(rd)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
data[line] = nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return listDep
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseLDDMac parses the output of the 'otool -L' command (macOS).
|
||||||
|
//
|
||||||
|
// It returns a slice of strings which represents all the shared libs of
|
||||||
|
// a particular package.
|
||||||
|
func parseLDDMac(output string, data map[string][]string, lddMap map[string][]string,
|
||||||
|
fullDeps bool) []string {
|
||||||
|
|
||||||
|
listLdd := make([]string, 0)
|
||||||
|
lines := strings.Split(output, "\n")
|
||||||
|
// Remove first element
|
||||||
|
lines = lines[1:]
|
||||||
|
|
||||||
|
for _, line := range lines {
|
||||||
|
|
||||||
|
// Execute ldd only if fullDeps mode is set
|
||||||
|
if fullDeps {
|
||||||
|
rd := recursiveData{
|
||||||
|
data: data,
|
||||||
|
glMap: lddMap,
|
||||||
|
printDep: nil,
|
||||||
|
cmd: "otool -L " + line + " | awk '{ print $1 }'",
|
||||||
|
line: line,
|
||||||
|
listStr: listLdd,
|
||||||
|
level: -1,
|
||||||
|
}
|
||||||
|
listLdd = append(listLdd, line)
|
||||||
|
parseRecursive(rd)
|
||||||
|
} else {
|
||||||
|
data[line] = nil
|
||||||
|
}
|
||||||
|
|
||||||
|
}
|
||||||
|
return listLdd
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseLDD parses the output of the 'ldd' command.
|
||||||
|
//
|
||||||
|
// It returns a slice of strings which represents all the shared libs of
|
||||||
|
// a particular package.
|
||||||
|
func parseLDD(output string, data map[string][]string, lddMap map[string][]string,
|
||||||
|
fullDeps bool) []string {
|
||||||
|
|
||||||
|
listLdd := make([]string, 0)
|
||||||
|
for _, line := range strings.Split(output, "\n") {
|
||||||
|
words := strings.Fields(line)
|
||||||
|
|
||||||
|
if len(words) == 2 {
|
||||||
|
|
||||||
|
lib, path := words[0], words[1]
|
||||||
|
|
||||||
|
// Execute ldd only if fullDeps mode is set
|
||||||
|
if fullDeps {
|
||||||
|
rd := recursiveData{
|
||||||
|
data: data,
|
||||||
|
glMap: lddMap,
|
||||||
|
printDep: nil,
|
||||||
|
cmd: "ldd " + path + " | awk '/ => / { print $1,$3 }'",
|
||||||
|
line: lib,
|
||||||
|
listStr: listLdd,
|
||||||
|
level: -1,
|
||||||
|
}
|
||||||
|
listLdd = append(listLdd, lib)
|
||||||
|
parseRecursive(rd)
|
||||||
|
} else {
|
||||||
|
data[lib] = nil
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return listLdd
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseRecursive is used by parseDependencies and parseLDD to factorize code.
|
||||||
|
//
|
||||||
|
func parseRecursive(rD recursiveData) {
|
||||||
|
|
||||||
|
if _, in := rD.glMap[rD.line]; in {
|
||||||
|
// Use additional map to avoid executing again ldd
|
||||||
|
rD.data[rD.line] = rD.glMap[rD.line]
|
||||||
|
} else {
|
||||||
|
|
||||||
|
var libsAcc []string
|
||||||
|
out, err := u.ExecutePipeCommand(rD.cmd)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if rD.printDep == nil {
|
||||||
|
libsAcc = parseLDD(out, rD.data, rD.glMap, true)
|
||||||
|
} else {
|
||||||
|
libsAcc = parseDependencies(out, rD.data, rD.glMap, rD.printDep,
|
||||||
|
true, rD.level+1)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add the returned libsAcc to the maps
|
||||||
|
rD.data[rD.line] = libsAcc
|
||||||
|
rD.glMap[rD.line] = libsAcc
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ------------------------------Dynamic Output --------------------------------
|
||||||
|
|
||||||
|
// detectPermissionDenied detects if "Permission denied" substring is
|
||||||
|
// present within dynamic analysis output.
|
||||||
|
//
|
||||||
|
// It returns true if "Permission denied" is present, otherwise false.
|
||||||
|
func detectPermissionDenied(str string) bool {
|
||||||
|
if strings.Contains(str, "EACCESS (Permission denied)") ||
|
||||||
|
strings.Contains(str, "13: Permission denied") {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseTrace parses the output of the 'strace'/'ltrace' command.
|
||||||
|
//
|
||||||
|
// It returns true if command must be run with sudo, otherwise false.
|
||||||
|
func parseTrace(output string, data map[string]string) bool {
|
||||||
|
|
||||||
|
var re = regexp.MustCompile(`([a-zA-Z_0-9@/-]+?)\((.*)`)
|
||||||
|
for _, match := range re.FindAllStringSubmatch(output, -1) {
|
||||||
|
if len(match) > 1 {
|
||||||
|
// Detect if Permission denied is thrown
|
||||||
|
detected := detectPermissionDenied(match[2])
|
||||||
|
if detected {
|
||||||
|
// Command must be run with sudo
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
// Add symbol to map
|
||||||
|
data[match[1]] = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// parseLsof parses the output of the 'lsof' command.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func parseLsof(output string, data *u.DynamicData, fullDeps bool) error {
|
||||||
|
|
||||||
|
lddMap := make(map[string][]string)
|
||||||
|
for _, line := range strings.Split(output, "\n") {
|
||||||
|
if strings.Contains(line, ".so") {
|
||||||
|
words := strings.Split(line, "/")
|
||||||
|
data.SharedLibs[words[len(words)-1]] = nil
|
||||||
|
if fullDeps {
|
||||||
|
// Execute ldd only if fullDeps mode is set
|
||||||
|
if out, err := u.ExecutePipeCommand("ldd " + line +
|
||||||
|
" | awk '/ => / { print $1,$3 }'"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
data.SharedLibs[words[len(words)-1]] =
|
||||||
|
parseLDD(out, data.SharedLibs, lddMap, fullDeps)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
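To make the behaviour of parseTrace concrete, the following stand-alone sketch applies the same regular expression to a couple of hand-written strace/ltrace-style lines; the sample output lines are invented, only the pattern is taken from the code above.

package main

import (
    "fmt"
    "regexp"
)

func main() {
    // Same pattern as parseTrace: capture the called name and its argument list.
    re := regexp.MustCompile(`([a-zA-Z_0-9@/-]+?)\((.*)`)

    samples := []string{
        `openat(AT_FDCWD, "/etc/ld.so.cache", O_RDONLY|O_CLOEXEC) = 3`, // strace-like
        `malloc(512) = 0x55f1c2a2e2a0`,                                 // ltrace-like
    }

    calls := make(map[string]string)
    for _, s := range samples {
        for _, m := range re.FindAllStringSubmatch(s, -1) {
            if len(m) > 1 {
                calls[m[1]] = "" // symbol or system call name
            }
        }
    }
    fmt.Println(calls) // map[malloc: openat:]
}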
199
srcs/dependtool/run_deptool.go
Normal file
@@ -0,0 +1,199 @@
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"github.com/fatih/color"
|
||||||
|
"runtime"
|
||||||
|
"strings"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// RunAnalyserTool runs the dependency analyser tool.
|
||||||
|
func RunAnalyserTool(homeDir string, data *u.Data) {
|
||||||
|
|
||||||
|
// Support only Unix
|
||||||
|
if strings.ToLower(runtime.GOOS) == "windows" {
|
||||||
|
u.PrintErr("Windows platform is not supported")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Init and parse local arguments
|
||||||
|
args := new(u.Arguments)
|
||||||
|
p, err := args.InitArguments()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
if err := parseLocalArguments(p, args); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get program path
|
||||||
|
programPath, err := u.GetProgramPath(&*args.StringArg[programArg])
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr("Could not determine program path", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get program Name
|
||||||
|
programName := *args.StringArg[programArg]
|
||||||
|
|
||||||
|
// Create the folder 'output' if it does not exist
|
||||||
|
outFolder := homeDir + u.SEP + programName + "_" + u.OUTFOLDER
|
||||||
|
if _, err := u.CreateFolder(outFolder); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Display Minor Details
|
||||||
|
displayProgramDetails(programName, programPath, args)
|
||||||
|
|
||||||
|
// Check if the program is a binary
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
checkElf(&programPath)
|
||||||
|
} else if strings.ToLower(runtime.GOOS) == "darwin" {
|
||||||
|
checkMachOS(&programPath)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run static analyser
|
||||||
|
u.PrintHeader1("(1.1) RUN STATIC ANALYSIS")
|
||||||
|
runStaticAnalyser(args, programName, programPath, outFolder, data)
|
||||||
|
|
||||||
|
// Run dynamic analyser
|
||||||
|
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
u.PrintHeader1("(1.2) RUN DYNAMIC ANALYSIS")
|
||||||
|
runDynamicAnalyser(args, programName, programPath, outFolder, data)
|
||||||
|
} else {
|
||||||
|
// dtruss/dtrace on mac needs to disable system integrity protection
|
||||||
|
u.PrintWarning("Dynamic analysis is not supported on mac")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save Data to JSON
|
||||||
|
if err = u.RecordDataJson(outFolder+programName, data); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
} else {
|
||||||
|
u.PrintOk("JSON Data saved into " + outFolder + programName +
|
||||||
|
".json")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Save graph if full dependencies option is set
|
||||||
|
if *args.BoolArg[fullDepsArg] {
|
||||||
|
saveGraph(programName, outFolder, data)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// displayProgramDetails displays various information such as the program name, path and options.
|
||||||
|
func displayProgramDetails(programName, programPath string, args *u.Arguments) {
|
||||||
|
fmt.Println("----------------------------------------------")
|
||||||
|
fmt.Println("Analyze Program: ", color.GreenString(programName))
|
||||||
|
fmt.Println("Full Path: ", color.GreenString(programPath))
|
||||||
|
if len(*args.StringArg[optionsArg]) > 0 {
|
||||||
|
fmt.Println("Options: ", color.GreenString(*args.StringArg[optionsArg]))
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(*args.StringArg[configFileArg]) > 0 {
|
||||||
|
fmt.Println("Config file: ", color.GreenString(*args.StringArg[configFileArg]))
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(*args.StringArg[testFileArg]) > 0 {
|
||||||
|
fmt.Println("Test file: ", color.GreenString(*args.StringArg[testFileArg]))
|
||||||
|
}
|
||||||
|
|
||||||
|
fmt.Println("----------------------------------------------")
|
||||||
|
}
|
||||||
|
|
||||||
|
// checkMachOS checks if the program (from its path) is a Mach-O file
|
||||||
|
func checkMachOS(programPath *string) {
|
||||||
|
if err := getMachOS(*programPath); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// checkElf checks if the program (from its path) is an ELF file
|
||||||
|
func checkElf(programPath *string) {
|
||||||
|
elfFile, err := getElf(*programPath)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
} else if elfFile == nil {
|
||||||
|
*programPath = ""
|
||||||
|
u.PrintWarning("Only ELF binaries are supported! Some analysis" +
|
||||||
|
" procedures will be skipped")
|
||||||
|
} else {
|
||||||
|
// Get ELF architecture
|
||||||
|
architecture, machine := GetElfArchitecture(elfFile)
|
||||||
|
fmt.Println("ELF Class: ", architecture)
|
||||||
|
fmt.Println("Machine: ", machine)
|
||||||
|
fmt.Println("Entry Point: ", elfFile.Entry)
|
||||||
|
fmt.Println("----------------------------------------------")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// runStaticAnalyser runs the static analyser
|
||||||
|
func runStaticAnalyser(args *u.Arguments, programName, programPath,
|
||||||
|
outFolder string, data *u.Data) {
|
||||||
|
|
||||||
|
staticAnalyser(*args, data, programPath)
|
||||||
|
|
||||||
|
// Save static Data into text file if display mode is set
|
||||||
|
if *args.BoolArg[saveOutputArg] {
|
||||||
|
|
||||||
|
// Create the folder 'output/static' if it does not exist
|
||||||
|
outFolderStatic := outFolder + "static" + u.SEP
|
||||||
|
if _, err := u.CreateFolder(outFolderStatic); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn := outFolderStatic + programName + ".txt"
|
||||||
|
headersStr := []string{"Dependencies (from apt-cache show) list:",
|
||||||
|
"Shared libraries list:", "System calls list:", "Symbols list:"}
|
||||||
|
|
||||||
|
if err := u.RecordDataTxt(fn, headersStr, data.StaticData); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
u.PrintOk("Data saved into " + fn)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// runDynamicAnalyser runs the dynamic analyser.
|
||||||
|
func runDynamicAnalyser(args *u.Arguments, programName, programPath,
|
||||||
|
outFolder string, data *u.Data) {
|
||||||
|
|
||||||
|
dynamicAnalyser(args, data, programPath)
|
||||||
|
|
||||||
|
// Save dynamic Data into text file if display mode is set
|
||||||
|
if *args.BoolArg[saveOutputArg] {
|
||||||
|
|
||||||
|
// Create the folder 'output/dynamic' if it does not exist
|
||||||
|
outFolderDynamic := outFolder + "dynamic" + u.SEP
|
||||||
|
if _, err := u.CreateFolder(outFolderDynamic); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
fn := outFolderDynamic + programName + ".txt"
|
||||||
|
headersStr := []string{"Shared libraries list:", "System calls list:",
|
||||||
|
"Symbols list:"}
|
||||||
|
|
||||||
|
if err := u.RecordDataTxt(fn, headersStr, data.DynamicData); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
u.PrintOk("Data saved into " + fn)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// saveGraph saves dependency graphs of a given app into the output folder.
|
||||||
|
func saveGraph(programName, outFolder string, data *u.Data) {
|
||||||
|
|
||||||
|
if len(data.StaticData.SharedLibs) > 0 {
|
||||||
|
u.GenerateGraph(programName, outFolder+"static"+u.SEP+
|
||||||
|
programName+"_shared_libs", data.StaticData.SharedLibs, nil)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(data.StaticData.Dependencies) > 0 {
|
||||||
|
u.GenerateGraph(programName, outFolder+"static"+u.SEP+
|
||||||
|
programName+"_dependencies", data.StaticData.Dependencies, nil)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(data.StaticData.SharedLibs) > 0 {
|
||||||
|
u.GenerateGraph(programName, outFolder+"dynamic"+u.SEP+
|
||||||
|
programName+"_shared_libs", data.DynamicData.SharedLibs, nil)
|
||||||
|
}
|
||||||
|
}
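RunAnalyserTool persists everything gathered into <program>.json via u.RecordDataJson. Assuming that helper simply writes the u.Data structure with encoding/json (an assumption, since its implementation is not part of this section), a downstream consumer could reload the file generically as sketched below; the file path is only an example of the <program>_output layout created above.

package main

import (
    "encoding/json"
    "fmt"
    "io/ioutil"
    "log"
)

func main() {
    // Example path: <home>/<program>_output/<program>.json (layout assumed).
    raw, err := ioutil.ReadFile("redis_output/redis.json")
    if err != nil {
        log.Fatal(err)
    }

    // Decode generically so this sketch does not depend on the exact u.Data layout.
    var data map[string]interface{}
    if err := json.Unmarshal(raw, &data); err != nil {
        log.Fatal(err)
    }
    for section := range data {
        fmt.Println("section:", section) // e.g. the static and dynamic result sets
    }
}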
232
srcs/dependtool/static_analyser.go
Normal file
@@ -0,0 +1,232 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"fmt"
|
||||||
|
"os"
|
||||||
|
"runtime"
|
||||||
|
"strings"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
// ---------------------------------Gather Data---------------------------------
|
||||||
|
|
||||||
|
// gatherStaticSymbols gathers symbols of a given application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func gatherStaticSymbols(programPath string, data *u.StaticData) error {
|
||||||
|
|
||||||
|
// Use 'readelf' to get symbols
|
||||||
|
if output, err := u.ExecuteCommand("readelf", []string{"-s",
|
||||||
|
programPath}); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
parseReadELF(output, data)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// gatherStaticSystemCalls gathers system calls of a given application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func gatherStaticSystemCalls(programPath, argument string, data *u.StaticData) error {
|
||||||
|
|
||||||
|
var args []string
|
||||||
|
if len(argument) > 0 {
|
||||||
|
args = []string{argument, programPath}
|
||||||
|
} else {
|
||||||
|
args = []string{programPath}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use 'nm' to get symbols and system calls
|
||||||
|
if output, err := u.ExecuteCommand("nm", args); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
parseNMLinux(output, data)
|
||||||
|
} else {
|
||||||
|
parseNMMac(output, data)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// gatherStaticSharedLibsLinux gathers shared libraries of a given application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func gatherStaticSharedLibsLinux(programPath string, data *u.StaticData,
|
||||||
|
v bool) error {
|
||||||
|
|
||||||
|
// Use 'ldd' to get shared libraries
|
||||||
|
if output, err := u.ExecutePipeCommand("ldd " + programPath +
|
||||||
|
" | awk '/ => / { print $1,$3 }'"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
// Init SharedLibs
|
||||||
|
lddGlMap := make(map[string][]string)
|
||||||
|
_ = parseLDD(output, data.SharedLibs, lddGlMap, v)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func gatherStaticSharedLibsMac(programPath string, data *u.StaticData,
|
||||||
|
v bool) error {
|
||||||
|
|
||||||
|
// Use 'otool' to get shared libraries
|
||||||
|
if output, err := u.ExecutePipeCommand("otool -L " + programPath +
|
||||||
|
" | awk '{ print $1 }'"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
// Init SharedLibs
|
||||||
|
lddGlMap := make(map[string][]string)
|
||||||
|
_ = parseLDDMac(output, data.SharedLibs, lddGlMap, v)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// gatherDependencies gathers dependencies of a given application.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func gatherDependencies(programName string, data *u.StaticData, v bool) error {
|
||||||
|
|
||||||
|
// Use 'apt-cache pkgnames' to get the name of the package
|
||||||
|
output, err := u.ExecuteCommand("apt-cache",
|
||||||
|
[]string{"pkgnames", programName})
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// If the name of the package is known, execute apt-cache depends
|
||||||
|
if len(output) > 0 {
|
||||||
|
// Parse package name
|
||||||
|
packageName := parsePackagesName(output)
|
||||||
|
|
||||||
|
if len(packageName) > 0 {
|
||||||
|
return executeDependAptCache(packageName, data, v)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
// Enter manually the name of the package
|
||||||
|
u.PrintWarning(programName + " not found in apt-cache")
|
||||||
|
var output string
|
||||||
|
for len(output) == 0 {
|
||||||
|
fmt.Print("Please enter manually the name of the package " +
|
||||||
|
"(empty string to exit): ")
|
||||||
|
scanner := bufio.NewScanner(os.Stdin)
|
||||||
|
if err := scanner.Err(); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
if scanner.Scan() {
|
||||||
|
|
||||||
|
// Get the new package name
|
||||||
|
input := scanner.Text()
|
||||||
|
if input == "" {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
|
||||||
|
output, err = u.ExecuteCommand("apt-cache",
|
||||||
|
[]string{"pkgnames", input})
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(output) == 0 {
|
||||||
|
u.PrintWarning("Skip dependencies analysis from apt-cache depends")
|
||||||
|
} else {
|
||||||
|
packageName := parsePackagesName(output)
|
||||||
|
return executeDependAptCache(packageName, data, v)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// executeDependAptCache gathers dependencies by executing 'apt-cache depends'.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func executeDependAptCache(programName string, data *u.StaticData,
|
||||||
|
fullDeps bool) error {
|
||||||
|
|
||||||
|
// Use 'apt-cache depends' to get dependencies
|
||||||
|
if output, err := u.ExecutePipeCommand("apt-cache depends " +
|
||||||
|
programName + " | awk '/Depends/ { print $2 }'"); err != nil {
|
||||||
|
return err
|
||||||
|
} else {
|
||||||
|
// Init Dependencies (from apt cache depends)
|
||||||
|
data.Dependencies = make(map[string][]string)
|
||||||
|
dependenciesMap := make(map[string][]string)
|
||||||
|
printDep := make(map[string][]string)
|
||||||
|
_ = parseDependencies(output, data.Dependencies, dependenciesMap,
|
||||||
|
printDep, fullDeps, 0)
|
||||||
|
}
|
||||||
|
|
||||||
|
fmt.Println("----------------------------------------------")
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// -------------------------------------Run-------------------------------------
|
||||||
|
|
||||||
|
// staticAnalyser runs the static analysis to get shared libraries,
|
||||||
|
// system calls and library calls of a given application.
|
||||||
|
//
|
||||||
|
func staticAnalyser(args u.Arguments, data *u.Data, programPath string) {
|
||||||
|
|
||||||
|
programName := *args.StringArg[programArg]
|
||||||
|
fullDeps := *args.BoolArg[fullDepsArg]
|
||||||
|
|
||||||
|
staticData := &data.StaticData
|
||||||
|
|
||||||
|
// If the program is a binary, run the static analysis tools
|
||||||
|
if len(programPath) > 0 {
|
||||||
|
// Gather Data from binary file
|
||||||
|
|
||||||
|
// Init symbols members
|
||||||
|
staticData.Symbols = make(map[string]string)
|
||||||
|
staticData.SystemCalls = make(map[string]string)
|
||||||
|
staticData.SharedLibs = make(map[string][]string)
|
||||||
|
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
u.PrintHeader2("(*) Gathering symbols from binary file")
|
||||||
|
if err := gatherStaticSymbols(programPath, staticData); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
u.PrintHeader2("(*) Gathering symbols & system calls from binary file")
|
||||||
|
if err := gatherStaticSystemCalls(programPath, "-D", staticData); err != nil {
|
||||||
|
// Check without the dynamic argument
|
||||||
|
if err := gatherStaticSystemCalls(programPath, "", staticData); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
u.PrintHeader2("(*) Gathering shared libraries from binary file")
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
if err := gatherStaticSharedLibsLinux(programPath, staticData,
|
||||||
|
fullDeps); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if err := gatherStaticSharedLibsMac(programPath, staticData,
|
||||||
|
fullDeps); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if strings.ToLower(runtime.GOOS) == "linux" {
|
||||||
|
// Gather Data from apt-cache
|
||||||
|
u.PrintHeader2("(*) Gathering dependencies from apt-cache depends")
|
||||||
|
if err := gatherDependencies(programName, staticData, fullDeps); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
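gatherStaticSymbols feeds the raw `readelf -s` listing to parseReadELF (in parser.go), which keeps rows whose type column is FUNC, FILE or OBJECT and splits the name column on '@' to separate a symbol from its version. The sketch below runs that same field logic on one invented readelf output row, so the column indices used above are easier to follow.

package main

import (
    "fmt"
    "strings"
)

func main() {
    // One invented 'readelf -s' row: Num Value Size Type Bind Vis Ndx Name (Version).
    line := "  2833: 0000000000021b70   480 FUNC    GLOBAL DEFAULT  UND malloc@GLIBC_2.2.5 (2)"

    types := map[string]bool{"FUNC": true, "FILE": true, "OBJECT": true}
    symbols := make(map[string]string)

    words := strings.Fields(line)
    // Same condition as parseReadELF; it assumes column 8 holds a versioned name@version.
    if len(words) > 8 && types[words[3]] {
        symbol := strings.Split(words[7], "@")
        symbols[symbol[0]] = symbol[1] // name -> version
    }
    fmt.Println(symbols) // map[malloc:GLIBC_2.2.5]
}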
141
srcs/dependtool/system_calls.go
Normal file
@@ -0,0 +1,141 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
// initSystemCalls initialises the map of all Linux system call names.
|
||||||
|
//
|
||||||
|
// It returns a map of all system calls.
|
||||||
|
func initSystemCalls() map[string]*string {
|
||||||
|
return map[string]*string{"_llseek": nil, "_newselect": nil, "_sysctl": nil,
|
||||||
|
"accept": nil, "accept4": nil, "access": nil, "acct": nil,
|
||||||
|
"add_key": nil, "adjtimex": nil, "alarm": nil, "alloc_hugepages": nil,
|
||||||
|
"arc_gettls": nil, "arc_settls": nil, "arc_usr_cmpxchg": nil,
|
||||||
|
"arch_prctl": nil, "atomic_barrier": nil, "atomic_cmpxchg_32": nil,
|
||||||
|
"bdflush": nil, "bfin_spinlock": nil, "bind": nil, "bpf": nil,
|
||||||
|
"brk": nil, "breakpoint": nil, "cacheflush": nil, "capget": nil,
|
||||||
|
"capset": nil, "chdir": nil, "chmod": nil, "chown": nil, "chown32": nil,
|
||||||
|
"chroot": nil, "clock_adjtime": nil, "clock_getres": nil,
|
||||||
|
"clock_gettime": nil, "clock_nanosleep": nil, "connect": nil,
|
||||||
|
"copy_file_range": nil, "creat": nil, "create_module": nil,
|
||||||
|
"delete_module": nil, "dma_memcpy": nil, "dup": nil, "dup2": nil,
|
||||||
|
"dup3": nil, "epoll_create": nil, "epoll_create1": nil,
|
||||||
|
"epoll_ctl": nil, "epoll_pwait": nil, "epoll_wait": nil, "eventfd": nil,
|
||||||
|
"eventfd2": nil, "execv": nil, "execve": nil, "execveat": nil,
|
||||||
|
"exit": nil, "exit_group": nil, "faccessat": nil, "fadvise64": nil,
|
||||||
|
"fadvise64_64": nil, "fallocate": nil, "fanotify_init": nil,
|
||||||
|
"fanotify_mark": nil, "fchdir": nil, "fchmod": nil,
|
||||||
|
"fchmodat": nil, "fchown": nil, "fchown32": nil, "fchownat": nil,
|
||||||
|
"fcntl": nil, "fcntl64": nil, "fdatasync": nil, "fgetxattr": nil,
|
||||||
|
"finit_module": nil, "flistxattr": nil, "flock": nil, "fork": nil,
|
||||||
|
"free_hugepages": nil, "fremovexattr": nil, "fsetxattr": nil,
|
||||||
|
"fstat": nil, "fstat64": nil, "fstatat64": nil, "fstatfs": nil,
|
||||||
|
"fstatfs64": nil, "fsync": nil, "ftruncate": nil, "ftruncate64": nil,
|
||||||
|
"futex": nil, "futimesat": nil, "get_kernel_syms": nil,
|
||||||
|
"get_mempolicy": nil, "get_robust_list": nil, "get_thread_area": nil,
|
||||||
|
"get_tls": nil, "getcpu": nil, "getcwd": nil, "getdents": nil,
|
||||||
|
"getdents64": nil, "getdomainname": nil, "getdtablesize": nil,
|
||||||
|
"getegid": nil, "getegid32": nil, "geteuid": nil, "geteuid32": nil,
|
||||||
|
"getgid": nil, "getgid32": nil, "getgroups": nil, "getgroups32": nil,
|
||||||
|
"gethostname": nil, "getitimer": nil, "getpeername": nil,
|
||||||
|
"getpagesize": nil, "getpgid": nil, "getpgrp": nil, "getpid": nil,
|
||||||
|
"getppid": nil, "getpriority": nil, "getrandom": nil, "getresgid": nil,
|
||||||
|
"getresgid32": nil, "getresuid": nil, "getresuid32": nil,
|
||||||
|
"getrlimit": nil, "getrusage": nil, "getsid": nil, "getsockname": nil,
|
||||||
|
"getsockopt": nil, "gettid": nil, "gettimeofday": nil, "getuid": nil,
|
||||||
|
"getuid32": nil, "getunwind": nil, "getxattr": nil, "getxgid": nil,
|
||||||
|
"getxpid": nil, "getxuid": nil, "init_module": nil,
|
||||||
|
"inotify_add_watch": nil, "inotify_init": nil, "inotify_init1": nil,
|
||||||
|
"inotify_rm_watch": nil, "io_cancel": nil, "io_destroy": nil,
|
||||||
|
"io_getevents": nil, "io_pgetevents": nil, "io_setup": nil,
|
||||||
|
"io_submit": nil, "ioctl": nil, "ioperm": nil, "iopl": nil,
|
||||||
|
"ioprio_get": nil, "ioprio_set": nil, "ipc": nil, "kcmp": nil,
|
||||||
|
"kern_features": nil, "kexec_file_load": nil, "kexec_load": nil,
|
||||||
|
"keyctl": nil, "kill": nil, "lchown": nil, "lchown32": nil,
|
||||||
|
"lgetxattr": nil, "link": nil, "linkat": nil, "listen": nil,
|
||||||
|
"listxattr": nil, "llistxattr": nil, "lookup_dcookie": nil,
|
||||||
|
"lremovexattr": nil, "lseek": nil, "lsetxattr": nil, "lstat": nil,
|
||||||
|
"lstat64": nil, "madvise": nil, "mbind": nil, "memory_ordering": nil,
|
||||||
|
"metag_get_tls": nil, "metag_set_fpu_flags": nil, "metag_set_tls": nil,
|
||||||
|
"metag_setglobalbit": nil, "membarrier": nil, "memfd_create": nil,
|
||||||
|
"migrate_pages": nil, "mincore": nil, "mkdir": nil,
|
||||||
|
"mkdirat": nil, "mknod": nil, "mknodat": nil, "mlock": nil,
|
||||||
|
"mlock2": nil, "mlockall": nil, "mmap": nil, "mmap2": nil,
|
||||||
|
"modify_ldt": nil, "mount": nil, "move_pages": nil, "mprotect": nil,
|
||||||
|
"mq_getsetattr": nil, "mq_notify": nil, "mq_open": nil,
|
||||||
|
"mq_timedreceive": nil, "mq_timedsend": nil, "mq_unlink": nil,
|
||||||
|
"mremap": nil, "msgctl": nil, "msgget": nil, "msgrcv": nil,
|
||||||
|
"msgsnd": nil, "msync": nil, "munlock": nil, "munlockall": nil,
|
||||||
|
"munmap": nil, "name_to_handle_at": nil, "nanosleep": nil,
|
||||||
|
"newfstatat": nil, "nfsservctl": nil, "nice": nil, "old_adjtimex": nil,
|
||||||
|
"old_getrlimit": nil, "oldfstat": nil, "oldlstat": nil,
|
||||||
|
"oldolduname": nil, "oldstat": nil, "oldumount": nil, "olduname": nil,
|
||||||
|
"open": nil, "open_by_handle_at": nil, "openat": nil,
|
||||||
|
"or1k_atomic": nil, "pause": nil, "pciconfig_iobase": nil,
|
||||||
|
"pciconfig_read": nil, "pciconfig_write": nil, "perf_event_open": nil,
|
||||||
|
"personality": nil, "perfctr": nil, "perfmonctl": nil, "pipe": nil,
|
||||||
|
"pipe2": nil, "pivot_root": nil, "pkey_alloc": nil, "pkey_free": nil,
|
||||||
|
"pkey_mprotect": nil, "poll": nil, "ppoll": nil, "prctl": nil,
|
||||||
|
"pread": nil, "pread64": nil, "preadv": nil, "preadv2": nil,
|
||||||
|
"prlimit64": nil, "process_vm_readv": nil, "process_vm_writev": nil,
|
||||||
|
"pselect6": nil, "ptrace": nil, "pwrite": nil, "pwrite64": nil,
|
||||||
|
"pwritev": nil, "pwritev2": nil, "query_module": nil, "quotactl": nil,
|
||||||
|
"read": nil, "readahead": nil, "readdir": nil, "readlink": nil,
|
||||||
|
"readlinkat": nil, "readv": nil, "reboot": nil, "recv": nil,
|
||||||
|
"recvfrom": nil, "recvmsg": nil, "recvmmsg": nil,
|
||||||
|
"remap_file_pages": nil, "removexattr": nil, "rename": nil,
|
||||||
|
"renameat": nil, "renameat2": nil, "request_key": nil,
|
||||||
|
"restart_syscall": nil, "riscv_flush_icache": nil, "rmdir": nil,
|
||||||
|
"rseq": nil, "rt_sigaction": nil, "rt_sigpending": nil,
|
||||||
|
"rt_sigprocmask": nil, "rt_sigqueueinfo": nil, "rt_sigreturn": nil,
|
||||||
|
"rt_sigsuspend": nil, "rt_sigtimedwait": nil, "rt_tgsigqueueinfo": nil,
|
||||||
|
"rtas": nil, "s390_runtime_instr": nil, "s390_pci_mmio_read": nil,
|
||||||
|
"s390_pci_mmio_write": nil, "s390_sthyi": nil,
|
||||||
|
"s390_guarded_storage": nil, "sched_get_affinity": nil,
|
||||||
|
"sched_get_priority_max": nil, "sched_get_priority_min": nil,
|
||||||
|
"sched_getaffinity": nil, "sched_getattr": nil, "sched_getparam": nil,
|
||||||
|
"sched_getscheduler": nil, "sched_rr_get_interval": nil,
|
||||||
|
"sched_set_affinity": nil, "sched_setaffinity": nil,
|
||||||
|
"sched_setattr": nil, "sched_setparam": nil, "sched_setscheduler": nil,
|
||||||
|
"sched_yield": nil, "seccomp": nil, "select": nil, "semctl": nil,
|
||||||
|
"semget": nil, "semop": nil, "semtimedop": nil, "send": nil,
|
||||||
|
"sendfile": nil, "sendfile64": nil, "sendmmsg": nil, "sendmsg": nil,
|
||||||
|
"sendto": nil, "set_mempolicy": nil, "set_robust_list": nil,
|
||||||
|
"set_thread_area": nil, "set_tid_address": nil, "set_tls": nil,
|
||||||
|
"setdomainname": nil, "setfsgid": nil, "setfsgid32": nil,
|
||||||
|
"setfsuid": nil, "setfsuid32": nil, "setgid": nil, "setgid32": nil,
|
||||||
|
"setgroups": nil, "setgroups32": nil, "sethae": nil, "sethostname": nil,
|
||||||
|
"setitimer": nil, "setns": nil, "setpgid": nil, "setpgrp": nil,
|
||||||
|
"setpriority": nil, "setregid": nil, "setregid32": nil,
|
||||||
|
"setresgid": nil, "setresgid32": nil, "setresuid": nil,
|
||||||
|
"setresuid32": nil, "setreuid": nil, "setreuid32": nil,
|
||||||
|
"setrlimit": nil, "setsid": nil, "setsockopt": nil, "settimeofday": nil,
|
||||||
|
"setuid": nil, "setuid32": nil, "setup": nil, "setxattr": nil,
|
||||||
|
"sgetmask": nil, "shmat": nil, "shmctl": nil, "shmdt": nil,
|
||||||
|
"shmget": nil, "shutdown": nil, "sigaction": nil, "sigaltstack": nil,
|
||||||
|
"signal": nil, "signalfd": nil, "signalfd4": nil, "sigpending": nil,
|
||||||
|
"sigprocmask": nil, "sigreturn": nil, "sigsuspend": nil, "socket": nil,
|
||||||
|
"socketcall": nil, "socketpair": nil, "spill": nil, "splice": nil,
|
||||||
|
"spu_create": nil, "spu_run": nil, "sram_alloc": nil, "sram_free": nil,
|
||||||
|
"ssetmask": nil, "stat": nil, "stat64": nil, "statfs": nil,
|
||||||
|
"statfs64": nil, "statx": nil, "stime": nil, "subpage_prot": nil,
|
||||||
|
"switch_endian": nil, "swapcontext": nil, "swapoff": nil, "swapon": nil,
|
||||||
|
"symlink": nil, "symlinkat": nil, "sync": nil, "sync_file_range": nil,
|
||||||
|
"sync_file_range2": nil, "syncfs": nil, "sys_debug_setcontext": nil,
|
||||||
|
"syscall": nil, "sysfs": nil, "sysinfo": nil, "syslog": nil,
|
||||||
|
"sysmips": nil, "tee": nil, "tgkill": nil, "time": nil,
|
||||||
|
"timer_create": nil, "timer_delete": nil, "timer_getoverrun": nil,
|
||||||
|
"timer_gettime": nil, "timer_settime": nil,
|
||||||
|
"timerfd_create": nil, "timerfd_gettime": nil, "timerfd_settime": nil,
|
||||||
|
"times": nil, "tkill": nil, "truncate": nil, "truncate64": nil,
|
||||||
|
"ugetrlimit": nil, "umask": nil, "umount": nil, "umount2": nil,
|
||||||
|
"uname": nil, "unlink": nil, "unlinkat": nil, "unshare": nil,
|
||||||
|
"uselib": nil, "ustat": nil, "userfaultfd": nil, "usr26": nil,
|
||||||
|
"usr32": nil, "utime": nil, "utimensat": nil, "utimes": nil,
|
||||||
|
"utrap_install": nil, "vfork": nil, "vhangup": nil, "vm86old": nil,
|
||||||
|
"vm86": nil, "vmsplice": nil, "wait4": nil, "waitid": nil,
|
||||||
|
"waitpid": nil, "write": nil, "writev": nil, "xtensa": nil}
|
||||||
|
}
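initSystemCalls returns a map keyed by every Linux system call name, and parseNMLinux/parseNMMac use it purely as a set: membership decides whether an nm symbol is recorded as a system call or as an ordinary symbol. A minimal stand-alone illustration of that membership test follows; the shortened map and the symbol names are invented for the example.

package main

import "fmt"

func main() {
    // Tiny stand-in for the full map built by initSystemCalls.
    systemCalls := map[string]*string{"open": nil, "read": nil, "mmap": nil}

    symbols := []string{"read", "printf", "mmap"} // invented nm output names
    calls, others := make(map[string]string), make(map[string]string)

    for _, s := range symbols {
        // Same membership test as parseNMLinux: presence in the map marks a syscall.
        if _, isSyscall := systemCalls[s]; isSyscall {
            calls[s] = ""
        } else {
            others[s] = ""
        }
    }
    fmt.Println(calls, others) // map[mmap: read:] map[printf:]
}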
263
srcs/dependtool/tester.go
Normal file
@@ -0,0 +1,263 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
package dependtool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"context"
|
||||||
|
"fmt"
|
||||||
|
"math/rand"
|
||||||
|
"net"
|
||||||
|
"os"
|
||||||
|
"os/exec"
|
||||||
|
"strconv"
|
||||||
|
"strings"
|
||||||
|
"syscall"
|
||||||
|
"time"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
stdinTest = iota
|
||||||
|
execTest
|
||||||
|
telnetTest
|
||||||
|
externalTesting
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
stdinTestString = "stdin"
|
||||||
|
execTestString = "exec"
|
||||||
|
telnetTestString = "telnet"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
timeOutMs = 3000
|
||||||
|
startupSec = 3
|
||||||
|
)
|
||||||
|
|
||||||
|
type Testing struct {
|
||||||
|
TypeTest string `json:"typeTest"`
|
||||||
|
TimeOutTest int `json:"timeOutMsTest"`
|
||||||
|
|
||||||
|
// Only for telnet test
|
||||||
|
AddressTelnet string `json:"addressTelnet"`
|
||||||
|
PortTelnet int `json:"portTelnet"`
|
||||||
|
|
||||||
|
TimeCommand int32 `json:"timeMsCommand"`
|
||||||
|
ListCommands []string `json:"listCommands"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// checkTypeTest checks the type of testing: exec, stdin and telnet tests.
|
||||||
|
//
|
||||||
|
// It returns an integer which represents the type of tests
|
||||||
|
func checkTypeTest(testStruct *Testing) int {
|
||||||
|
|
||||||
|
if testStruct == nil {
|
||||||
|
return externalTesting
|
||||||
|
}
|
||||||
|
|
||||||
|
if strings.Compare(testStruct.TypeTest, stdinTestString) == 0 {
|
||||||
|
return stdinTest
|
||||||
|
} else if strings.Compare(testStruct.TypeTest, execTestString) == 0 {
|
||||||
|
return execTest
|
||||||
|
} else if strings.Compare(testStruct.TypeTest, telnetTestString) == 0 {
|
||||||
|
return telnetTest
|
||||||
|
}
|
||||||
|
|
||||||
|
return externalTesting
|
||||||
|
}
|
||||||
|
|
||||||
|
// setDurationTimeOut sets the duration of the timeout by computing a reference
|
||||||
|
// value. In addition an extra margin value is added (3sec).
|
||||||
|
//
|
||||||
|
// It returns a duration either in milliseconds or in seconds.
|
||||||
|
func setDurationTimeOut(t *Testing, dArgs DynamicArgs) time.Duration {
|
||||||
|
|
||||||
|
if checkTypeTest(t) != externalTesting {
|
||||||
|
// Compute the number of commands + execution time (+ 3 seconds safe margin)
|
||||||
|
totalMs := t.TimeCommand*int32(len(t.ListCommands)) + timeOutMs
|
||||||
|
return time.Duration(totalMs) * time.Millisecond
|
||||||
|
}
|
||||||
|
|
||||||
|
return time.Duration(dArgs.waitTime+startupSec) * time.Second
|
||||||
|
}
|
||||||
|
|
||||||
|
// runCommandTester runs a command and captures the stdout and stderr of the
// executed command. It also runs the Tester to explore several execution
// paths of the given app.
//
// It returns two strings which are respectively stdout and stderr.
|
||||||
|
func runCommandTester(programPath, programName, command, option string,
|
||||||
|
testStruct *Testing, dArgs DynamicArgs, data *u.DynamicData) (string, string) {
|
||||||
|
|
||||||
|
timeOut := setDurationTimeOut(testStruct, dArgs)
|
||||||
|
u.PrintInfo("Duration of " + programName + " : " + timeOut.String())
|
||||||
|
ctx, cancel := context.WithTimeout(context.Background(), timeOut)
|
||||||
|
defer cancel()
|
||||||
|
|
||||||
|
args := strings.Fields("-f " + programPath + " " + option)
|
||||||
|
cmd := exec.CommandContext(ctx, command, args...)
|
||||||
|
cmd.SysProcAttr = &syscall.SysProcAttr{Setpgid: true}
|
||||||
|
|
||||||
|
bufOut, bufErr, bufIn := &bytes.Buffer{}, &bytes.Buffer{}, &bytes.Buffer{}
|
||||||
|
cmd.Stdout = bufOut // Add io.MultiWriter(os.Stdout) to record on stdout
|
||||||
|
cmd.Stderr = bufErr // Add io.MultiWriter(os.Stderr) to record on stderr
|
||||||
|
|
||||||
|
if checkTypeTest(testStruct) == stdinTest {
|
||||||
|
cmd.Stdin = os.Stdin
|
||||||
|
for _, cmd := range testStruct.ListCommands {
|
||||||
|
bufIn.Write([]byte(cmd))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run the process (traced by strace/ltrace)
|
||||||
|
if err := cmd.Start(); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run a go routine to handle the tests
|
||||||
|
go func() {
|
||||||
|
if checkTypeTest(testStruct) != stdinTest {
|
||||||
|
Tester(programName, cmd, data, testStruct, dArgs)
|
||||||
|
|
||||||
|
// Kill the program after the tester has finished the job
|
||||||
|
if err := u.PKill(programName, syscall.SIGINT); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
|
||||||
|
// Ignore the error because the program is killed (waitTime)
|
||||||
|
_ = cmd.Wait()
|
||||||
|
|
||||||
|
if ctx.Err() == context.DeadlineExceeded {
|
||||||
|
u.PrintInfo("Timeout while executing: " + cmd.String())
|
||||||
|
return bufOut.String(), bufErr.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
return bufOut.String(), bufErr.String()
|
||||||
|
}
|
||||||
|
|
||||||
|
// Tester runs the executable file of a given application to perform tests to
|
||||||
|
// get program dependencies.
|
||||||
|
//
|
||||||
|
func Tester(programName string, cmd *exec.Cmd, data *u.DynamicData,
|
||||||
|
testStruct *Testing, dArgs DynamicArgs) {
|
||||||
|
|
||||||
|
if len(dArgs.testFile) > 0 {
|
||||||
|
// Wait until the program has started
|
||||||
|
time.Sleep(time.Second * startupSec)
|
||||||
|
u.PrintInfo("Run internal tests from file " + dArgs.testFile)
|
||||||
|
|
||||||
|
// Launch execution tests
|
||||||
|
if checkTypeTest(testStruct) == execTest {
|
||||||
|
launchTestsExternal(testStruct)
|
||||||
|
} else if checkTypeTest(testStruct) == telnetTest {
|
||||||
|
if len(testStruct.AddressTelnet) == 0 || testStruct.PortTelnet == 0 {
|
||||||
|
u.PrintWarning("Cannot find address and port for telnet " +
"within the json file. Skipping tests")
|
||||||
|
} else {
|
||||||
|
launchTelnetTest(testStruct)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Waiting for external tests for " + strconv.Itoa(
|
||||||
|
dArgs.waitTime) + " sec")
|
||||||
|
ticker := time.Tick(time.Second)
|
||||||
|
for i := 1; i <= dArgs.waitTime; i++ {
|
||||||
|
<-ticker
|
||||||
|
fmt.Printf("-")
|
||||||
|
}
|
||||||
|
fmt.Printf("\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Gather shared libs
|
||||||
|
u.PrintHeader2("(*) Gathering shared libs")
|
||||||
|
if err := gatherDynamicSharedLibs(programName, cmd.Process.Pid, data,
|
||||||
|
dArgs.fullDeps); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
//----------------------------------Tests---------------------------------------
|
||||||
|
|
||||||
|
// launchTestsExternal runs external tests written in the 'test.json' file.
|
||||||
|
//
|
||||||
|
func launchTestsExternal(testStruct *Testing) {
|
||||||
|
|
||||||
|
for _, cmd := range testStruct.ListCommands {
|
||||||
|
if len(cmd) > 0 {
|
||||||
|
|
||||||
|
// Perform a sleep between commands if specified
|
||||||
|
if testStruct.TimeCommand > 0 {
|
||||||
|
timeMs := rand.Int31n(testStruct.TimeCommand)
|
||||||
|
time.Sleep(time.Duration(timeMs) * time.Millisecond)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Execute each line as a command
|
||||||
|
if _, err := u.ExecutePipeCommand(cmd); err != nil {
|
||||||
|
u.PrintWarning("Impossible to execute test: " + cmd)
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Test executed: " + cmd)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// launchTelnetTest runs telnet tests written in the 'test.json' file.
|
||||||
|
//
|
||||||
|
func launchTelnetTest(testStruct *Testing) {
|
||||||
|
|
||||||
|
addr := testStruct.AddressTelnet + ":" + strconv.Itoa(testStruct.PortTelnet)
|
||||||
|
conn, _ := net.Dial("tcp", addr)
|
||||||
|
|
||||||
|
for _, cmd := range testStruct.ListCommands {
|
||||||
|
if len(cmd) > 0 {
|
||||||
|
|
||||||
|
// Perform a sleep between commands if specified
|
||||||
|
if testStruct.TimeCommand > 0 {
|
||||||
|
timeMs := rand.Int31n(testStruct.TimeCommand)
|
||||||
|
time.Sleep(time.Duration(timeMs) * time.Millisecond)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Set a timeout to avoid blocking
|
||||||
|
if err := conn.SetReadDeadline(
|
||||||
|
time.Now().Add(time.Duration(timeOutMs) * time.Millisecond)); err != nil {
|
||||||
|
u.PrintWarning("Impossible to set a timeout on the TCP connection")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Send commands (test)
|
||||||
|
if _, err := fmt.Fprintf(conn, cmd+"\n"); err != nil {
|
||||||
|
u.PrintWarning("Impossible to execute test: " + cmd)
|
||||||
|
} else {
|
||||||
|
u.PrintInfo("Test executed: " + cmd)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Read response
|
||||||
|
message := readerTelnet(conn)
|
||||||
|
fmt.Println("----->Message from server: " + message)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// readerTelnet reads data from the telnet connection.
|
||||||
|
//
|
||||||
|
func readerTelnet(conn net.Conn) (out string) {
|
||||||
|
var buffer [1]byte
|
||||||
|
recvData := buffer[:]
|
||||||
|
var n int
|
||||||
|
var err error
|
||||||
|
|
||||||
|
for {
|
||||||
|
n, err = conn.Read(recvData)
|
||||||
|
if n <= 0 || err != nil {
|
||||||
|
break
|
||||||
|
} else {
|
||||||
|
out += string(recvData)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return out
|
||||||
|
}
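Note: the Testing struct above defines the schema of the JSON test descriptions shipped under testfiles/ (typeTest, timeOutMsTest, addressTelnet, portTelnet, timeMsCommand, listCommands). As a minimal sketch only, not part of this commit, the following standalone Go program shows how such a description would unmarshal into that struct; the embedded JSON is a hypothetical example mirroring testfiles/test_memcached.json.

package main

import (
	"encoding/json"
	"fmt"
)

// Testing mirrors the struct declared in srcs/dependtool/tester.go.
type Testing struct {
	TypeTest      string   `json:"typeTest"`
	TimeOutTest   int      `json:"timeOutMsTest"`
	AddressTelnet string   `json:"addressTelnet"`
	PortTelnet    int      `json:"portTelnet"`
	TimeCommand   int32    `json:"timeMsCommand"`
	ListCommands  []string `json:"listCommands"`
}

func main() {
	// Hypothetical telnet-style test description (same shape as testfiles/test_memcached.json).
	raw := []byte(`{
		"typeTest": "telnet",
		"addressTelnet": "127.0.0.1",
		"portTelnet": 11211,
		"timeMsCommand": 2000,
		"listCommands": ["stats", "get testkey"]
	}`)

	var t Testing
	if err := json.Unmarshal(raw, &t); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", t)
}

Running it simply prints the decoded struct, which can help when writing a new test.json for another application.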
|
75
srcs/main.go
Normal file
|
@ -0,0 +1,75 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package main
|
||||||
|
|
||||||
|
import (
|
||||||
|
"os/user"
|
||||||
|
"tools/srcs/buildtool"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
"tools/srcs/crawlertool"
|
||||||
|
"tools/srcs/dependtool"
|
||||||
|
"tools/srcs/veriftool"
|
||||||
|
)
|
||||||
|
|
||||||
|
func main() {
|
||||||
|
|
||||||
|
// Init global arguments
|
||||||
|
args := new(u.Arguments)
|
||||||
|
parser, err := args.InitArguments()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Parse arguments
|
||||||
|
if err := args.ParseMainArguments(parser, args); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Checks if the toolchain must be completely executed
|
||||||
|
all := false
|
||||||
|
if !*args.BoolArg[u.DEP] && !*args.BoolArg[u.BUILD] &&
|
||||||
|
!*args.BoolArg[u.VERIF] && !*args.BoolArg[u.PERF] {
|
||||||
|
all = true
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get user home folder
|
||||||
|
usr, err := user.Current()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
var data *u.Data
|
||||||
|
|
||||||
|
if *args.BoolArg[u.CRAWLER] {
|
||||||
|
u.PrintHeader1("(*) RUN CRAWLER UNIKRAFT ANALYSER")
|
||||||
|
crawlertool.RunCrawler()
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if all || *args.BoolArg[u.DEP] {
|
||||||
|
|
||||||
|
// Initialize data
|
||||||
|
data = new(u.Data)
|
||||||
|
|
||||||
|
u.PrintHeader1("(1) RUN DEPENDENCIES ANALYSER")
|
||||||
|
dependtool.RunAnalyserTool(usr.HomeDir, data)
|
||||||
|
}
|
||||||
|
|
||||||
|
if all || *args.BoolArg[u.BUILD] {
|
||||||
|
u.PrintHeader1("(2) AUTOMATIC BUILD TOOL")
|
||||||
|
buildtool.RunBuildTool(usr.HomeDir, data)
|
||||||
|
}
|
||||||
|
|
||||||
|
if all || *args.BoolArg[u.VERIF] {
|
||||||
|
u.PrintHeader1("(3) VERIFICATION TOOL")
|
||||||
|
veriftool.RunVerificationTool()
|
||||||
|
}
|
||||||
|
|
||||||
|
if all || *args.BoolArg[u.PERF] {
|
||||||
|
u.PrintHeader1("(4) PERFORMANCE OPTIMIZATION TOOL")
|
||||||
|
}
|
||||||
|
}
|
42
srcs/veriftool/args.go
Normal file
|
@ -0,0 +1,42 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package veriftool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"github.com/akamensky/argparse"
|
||||||
|
"os"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const (
|
||||||
|
programArg = "program"
|
||||||
|
testFileArg = "testFile"
|
||||||
|
configFileArg = "configFile"
|
||||||
|
optionsArg = "options"
|
||||||
|
unikraftArg = "unikraft"
|
||||||
|
)
|
||||||
|
|
||||||
|
// parseLocalArguments parses the arguments of the verification tool.
|
||||||
|
//
|
||||||
|
// It returns an error if any, otherwise it returns nil.
|
||||||
|
func parseLocalArguments(p *argparse.Parser, args *u.Arguments) error {
|
||||||
|
|
||||||
|
args.InitArgParse(p, args, u.STRING, "p", programArg,
|
||||||
|
&argparse.Options{Required: true, Help: "Program name"})
|
||||||
|
args.InitArgParse(p, args, u.STRING, "t", testFileArg,
|
||||||
|
&argparse.Options{Required: false, Help: "Path of the test file"})
|
||||||
|
args.InitArgParse(p, args, u.STRING, "c", configFileArg,
|
||||||
|
&argparse.Options{Required: false, Help: "Path of the config file"})
|
||||||
|
args.InitArgParse(p, args, u.STRING, "o", optionsArg,
|
||||||
|
&argparse.Options{Required: false, Default: "", Help: "Extra options for " +
|
||||||
|
"launching program"})
|
||||||
|
|
||||||
|
args.InitArgParse(p, args, u.STRING, "u", unikraftArg,
|
||||||
|
&argparse.Options{Required: false, Help: "Unikraft Path"})
|
||||||
|
|
||||||
|
return u.ParserWrapper(p, os.Args)
|
||||||
|
}
|
149
srcs/veriftool/run_veriftool.go
Normal file
|
@ -0,0 +1,149 @@
|
||||||
|
// Copyright 2019 The UNICORE Authors. All rights reserved.
|
||||||
|
// Use of this source code is governed by a BSD-style
|
||||||
|
// license that can be found in the LICENSE file
|
||||||
|
//
|
||||||
|
// Author: Gaulthier Gain <gaulthier.gain@uliege.be>
|
||||||
|
|
||||||
|
package veriftool
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"github.com/sergi/go-diff/diffmatchpatch"
|
||||||
|
"io/ioutil"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"strings"
|
||||||
|
u "tools/srcs/common"
|
||||||
|
)
|
||||||
|
|
||||||
|
const stdinCmd = "[STDIN]"
|
||||||
|
const testCmd = "[TEST]"
|
||||||
|
|
||||||
|
func RunVerificationTool() {
|
||||||
|
|
||||||
|
// Init and parse local arguments
|
||||||
|
args := new(u.Arguments)
|
||||||
|
p, err := args.InitArguments()
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
if err := parseLocalArguments(p, args); err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get program Name
|
||||||
|
programName := *args.StringArg[programArg]
|
||||||
|
|
||||||
|
// Take base path if absolute path is used
|
||||||
|
if filepath.IsAbs(programName) {
|
||||||
|
programName = filepath.Base(programName)
|
||||||
|
}
|
||||||
|
|
||||||
|
unikraftPath := *args.StringArg[unikraftArg]
|
||||||
|
if len(unikraftPath) == 0 {
|
||||||
|
u.PrintErr("Unikraft folder must exist! Run the build tool before " +
|
||||||
|
"using the verification tool")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the app folder
|
||||||
|
var appFolder string
|
||||||
|
if unikraftPath[len(unikraftPath)-1] != os.PathSeparator {
|
||||||
|
appFolder = unikraftPath + u.SEP + u.APPSFOLDER + programName + u.SEP
|
||||||
|
} else {
|
||||||
|
appFolder = unikraftPath + u.APPSFOLDER + programName + u.SEP
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the build folder
|
||||||
|
buildAppFolder := appFolder + u.BUILDFOLDER
|
||||||
|
|
||||||
|
// Get KVM image
|
||||||
|
var kvmUnikernel string
|
||||||
|
if file, err := u.OSReadDir(buildAppFolder); err != nil {
|
||||||
|
u.PrintWarning(err)
|
||||||
|
} else {
|
||||||
|
for _, f := range file {
|
||||||
|
if !f.IsDir() && strings.Contains(f.Name(), u.KVM_IMAGE) &&
|
||||||
|
len(filepath.Ext(f.Name())) == 0 {
|
||||||
|
kvmUnikernel = f.Name()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Kvm unikernel image
|
||||||
|
if len(kvmUnikernel) == 0 {
|
||||||
|
u.PrintWarning(errors.New("no KVM image found"))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Read test
|
||||||
|
argStdin := ""
|
||||||
|
if len(*args.StringArg[testFileArg]) > 0 {
|
||||||
|
|
||||||
|
var err error
|
||||||
|
var cmdTests []string
|
||||||
|
cmdTests, err = u.ReadLinesFile(*args.StringArg[testFileArg])
|
||||||
|
if err != nil {
|
||||||
|
u.PrintWarning("Cannot find test file: " + err.Error())
|
||||||
|
}
|
||||||
|
if strings.Contains(cmdTests[0], stdinCmd) {
|
||||||
|
argStdin = strings.Join(cmdTests[1:], "")
|
||||||
|
argStdin += "\n"
|
||||||
|
} else if strings.Contains(cmdTests[0], testCmd) {
|
||||||
|
// TODO: add support for other tests
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Test KVM app unikernel
|
||||||
|
unikernelFilename := appFolder + "output_" + kvmUnikernel + ".txt"
|
||||||
|
if err := testUnikernel(buildAppFolder+kvmUnikernel, unikernelFilename,
|
||||||
|
[]byte(argStdin)); err != nil {
|
||||||
|
u.PrintWarning("Impossible to write the output of verification to " +
|
||||||
|
unikernelFilename)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Test general app
|
||||||
|
appFilename := appFolder + "output_" + programName + ".txt"
|
||||||
|
if err := testApp(programName, appFilename, []byte(argStdin)); err != nil {
|
||||||
|
u.PrintWarning("Impossible to write the output of verification to " +
appFilename)
|
||||||
|
}
|
||||||
|
|
||||||
|
u.PrintInfo("Comparison output:")
|
||||||
|
|
||||||
|
// Compare both output
|
||||||
|
fmt.Println(compareOutput(unikernelFilename, appFilename))
|
||||||
|
|
||||||
|
}
|
||||||
|
|
||||||
|
func compareOutput(unikernelFilename, appFilename string) string {
|
||||||
|
f1, err := ioutil.ReadFile(unikernelFilename)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
f2, err := ioutil.ReadFile(appFilename)
|
||||||
|
if err != nil {
|
||||||
|
u.PrintErr(err)
|
||||||
|
}
|
||||||
|
|
||||||
|
dmp := diffmatchpatch.New()
|
||||||
|
|
||||||
|
diffs := dmp.DiffMain(string(f2), string(f1), false)
|
||||||
|
|
||||||
|
return dmp.DiffPrettyText(diffs)
|
||||||
|
}
|
||||||
|
|
||||||
|
func testApp(programName, outputFile string, argsStdin []byte) error {
|
||||||
|
bOut, _ := u.ExecuteRunCmdStdin(programName, argsStdin)
|
||||||
|
|
||||||
|
return u.WriteToFile(outputFile, bOut)
|
||||||
|
}
|
||||||
|
|
||||||
|
func testUnikernel(kvmUnikernel, outputFile string, argsStdin []byte) error {
|
||||||
|
argsQemu := []string{"-nographic", "-vga", "none", "-device",
|
||||||
|
"isa-debug-exit", "-kernel", kvmUnikernel}
|
||||||
|
|
||||||
|
bOut, _ := u.ExecuteRunCmdStdin("qemu-system-x86_64", argsStdin, argsQemu...)
|
||||||
|
|
||||||
|
return u.WriteToFile(outputFile, bOut)
|
||||||
|
}
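Note: compareOutput above relies on github.com/sergi/go-diff/diffmatchpatch to highlight divergences between the application run and the unikernel run. As a minimal, self-contained sketch, not part of this commit, the same calls can be exercised on two invented strings standing in for the output files:

package main

import (
	"fmt"

	"github.com/sergi/go-diff/diffmatchpatch"
)

func main() {
	// Invented stand-ins for the contents of output_<app>.txt and output_<kvm image>.txt.
	appOutput := "hello from the native application\n"
	unikernelOutput := "hello from the unikernel\n"

	dmp := diffmatchpatch.New()
	// Same argument order as compareOutput: application output first, unikernel output second.
	diffs := dmp.DiffMain(appOutput, unikernelOutput, false)

	// DiffPrettyText marks insertions and deletions with ANSI colour escapes.
	fmt.Println(dmp.DiffPrettyText(diffs))
}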
|
18
testfiles/queries_firebird.txt
Normal file
|
@ -0,0 +1,18 @@
|
||||||
|
display
|
||||||
|
z
|
||||||
|
mapping set
|
||||||
|
add test
|
||||||
|
delete test
|
||||||
|
add test
|
||||||
|
display test
|
||||||
|
modify test -pw newpassword
|
||||||
|
modify test -uid 1892
|
||||||
|
modify test -admin yes
|
||||||
|
add newuser -pw newuser -fname New -lname User
|
||||||
|
display newuser
|
||||||
|
add newadmin -pw secret -fname New -mname admin -lname User -admin yes
|
||||||
|
display newadmin
|
||||||
|
delete newuser
|
||||||
|
modify test -pw test
|
||||||
|
modify test -mname MiddleName -fname Fred
|
||||||
|
create database 'test' user 'test' password 'test';
|
85
testfiles/queries_influx.txt
Executable file
|
@ -0,0 +1,85 @@
|
||||||
|
# DDL
|
||||||
|
CREATE DATABASE pirates
|
||||||
|
CREATE RETENTION POLICY oneday ON pirates DURATION 1d REPLICATION 1
|
||||||
|
|
||||||
|
# DML
|
||||||
|
# CONTEXT-DATABASE: pirates
|
||||||
|
# CONTEXT-RETENTION-POLICY: oneday
|
||||||
|
|
||||||
|
treasures,captain_id=dread_pirate_roberts value=801 1439856000
|
||||||
|
treasures,captain_id=flint value=29 1439856000
|
||||||
|
treasures,captain_id=sparrow value=38 1439856000
|
||||||
|
treasures,captain_id=tetra value=47 1439856000
|
||||||
|
treasures,captain_id=crunch value=109 1439858880
|
||||||
|
|
||||||
|
SHOW DATABASES
|
||||||
|
USE NOAA_water_database
|
||||||
|
SHOW measurements
|
||||||
|
SHOW RETENTION POLICIES ON NOAA_water_database
|
||||||
|
SHOW SERIES ON NOAA_water_database
|
||||||
|
SHOW SERIES ON NOAA_water_database FROM "h2o_quality" WHERE "location" = 'coyote_creek' LIMIT 2
|
||||||
|
SHOW MEASUREMENTS ON NOAA_water_database WITH MEASUREMENT =~ /h2o.*/ WHERE "randtag" =~ /\d/
|
||||||
|
SHOW TAG KEYS ON "NOAA_water_database"
|
||||||
|
SHOW FIELD KEYS ON "NOAA_water_database"
|
||||||
|
|
||||||
|
SELECT * FROM "h2o_feet"
|
||||||
|
SELECT "level description","location","water_level" FROM "h2o_feet"
|
||||||
|
SELECT "level description"::field,"location"::tag,"water_level"::field FROM "h2o_feet"
|
||||||
|
SELECT *::field FROM "h2o_feet"
|
||||||
|
SELECT * FROM "NOAA_water_database"."autogen"."h2o_feet"
|
||||||
|
SELECT * FROM "NOAA_water_database".."h2o_feet"
|
||||||
|
SELECT "location" FROM "h2o_feet"
|
||||||
|
SELECT "water_level","location" FROM "h2o_feet"
|
||||||
|
SELECT * FROM "h2o_feet" WHERE "water_level" > 8
|
||||||
|
SELECT * FROM "h2o_feet" WHERE "level description" = 'below 3 feet'
|
||||||
|
SELECT * FROM "h2o_feet" WHERE "water_level" + 2 > 11.9
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica'
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" <> 'santa_monica' AND (water_level < -0.59 OR water_level > 9.95)
|
||||||
|
SELECT * FROM "h2o_feet" WHERE time > now() - 7d
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica
|
||||||
|
SELECT "level description" FROM "h2o_feet" WHERE "level description" = "at or greater than 9 feet"
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet" GROUP BY "location"
|
||||||
|
SELECT MEAN("index") FROM "h2o_quality" GROUP BY location,randtag
|
||||||
|
SELECT MEAN("index") FROM "h2o_quality" GROUP BY *
|
||||||
|
SELECT "water_level","location" FROM "h2o_feet" WHERE time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:30:00Z'
|
||||||
|
SELECT COUNT("water_level") FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:30:00Z' GROUP BY time(12m)
|
||||||
|
SELECT COUNT("water_level") FROM "h2o_feet" WHERE time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:30:00Z' GROUP BY time(12m),"location"
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:18:00Z'
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:06:00Z' AND time <= '2015-08-18T00:54:00Z' GROUP BY time(18m)
|
||||||
|
SELECT COUNT("water_level") FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:06:00Z' AND time < '2015-08-18T00:18:00Z' GROUP BY time(12m,6m)
|
||||||
|
SELECT COUNT("water_level") FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:06:00Z' AND time < '2015-08-18T00:18:00Z' GROUP BY time(12m)
|
||||||
|
SELECT MAX("water_level") FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-09-18T16:00:00Z' AND time <= '2015-09-18T16:42:00Z' GROUP BY time(12m)
|
||||||
|
SELECT MAX("water_level") FROM "h2o_feet" WHERE location = 'coyote_creek' AND time >= '2015-09-18T16:36:00Z' AND time <= '2015-09-18T16:54:00Z' GROUP BY time(12m) fill(previous)
|
||||||
|
SELECT MEAN("tadpoles") FROM "pond" WHERE time > '2016-11-11T21:24:00Z' AND time <= '2016-11-11T22:06:00Z' GROUP BY time(12m) fill(linear)
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location"='coyote_creek' AND time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:54:00Z'
|
||||||
|
SELECT * FROM "h2o_feet","h2o_pH"
|
||||||
|
SELECT "water_level" INTO "h2o_feet_copy_1" FROM "h2o_feet" WHERE "location" = 'coyote_creek'
|
||||||
|
SELECT "water_level" INTO "where_else"."autogen"."h2o_feet_copy_2" FROM "h2o_feet" WHERE "location" = 'coyote_creek'
|
||||||
|
SELECT * FROM "all_my_averages"
|
||||||
|
SELECT * FROM "where_else"."autogen"./.*/
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica' ORDER BY time DESC
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet" WHERE time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:42:00Z' GROUP BY *,time(12m) LIMIT 2
|
||||||
|
SELECT "water_level" FROM "h2o_feet" GROUP BY * SLIMIT 1
|
||||||
|
SELECT "water_level","location" FROM "h2o_feet" LIMIT 3 OFFSET 3
|
||||||
|
SELECT "water_level" FROM "h2o_feet" GROUP BY * SLIMIT 1 SOFFSET 1
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica' AND time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:18:00Z' tz('America/Chicago')
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica' AND time >= '2015-08-18T00:00:00.000000000Z' AND time <= '2015-08-18T00:12:00Z'
|
||||||
|
SELECT "water_level" FROM "h2o_feet" WHERE "location" = 'santa_monica' AND time >= 1439856000s AND time <= 1439856720s
|
||||||
|
SELECT "level description" FROM "h2o_feet" WHERE time > '2015-09-18T21:18:00Z' AND time < now() + 1000d
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet" WHERE "location"='santa_monica' AND time >= '2015-09-18T21:30:00Z' GROUP BY time(12m) fill(none)
|
||||||
|
SELECT /l/ FROM "h2o_feet" LIMIT 1
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet" WHERE "location" =~ /./
|
||||||
|
SELECT MEAN("water_level") FROM "h2o_feet"
|
||||||
|
SELECT SUM("max") FROM (SELECT MAX("water_level") FROM "h2o_feet" GROUP BY "location")
|
||||||
|
SELECT DERIVATIVE(MEAN("water_level")) AS "water_level_derivative" FROM "h2o_feet" WHERE time >= '2015-08-18T00:00:00Z' AND time <= '2015-08-18T00:30:00Z' GROUP BY time(12m),"location"
|
||||||
|
|
||||||
|
SELECT * FROM "add" WHERE "A" + 5 > 10
|
||||||
|
SELECT "A" ^ 4294967295 FROM "data"
|
||||||
|
|
||||||
|
CREATE RETENTION POLICY "one_day_only" ON "NOAA_water_database" DURATION 23h60m REPLICATION 1 DEFAULT
|
||||||
|
ALTER RETENTION POLICY "what_is_time" ON "NOAA_water_database" DURATION 3w SHARD DURATION 2h DEFAULT
|
||||||
|
DROP RETENTION POLICY "what_is_time" ON "NOAA_water_database"
|
||||||
|
DROP SERIES FROM "h2o_feet" WHERE "location" = 'santa_monica'
|
||||||
|
DROP SERIES FROM "h2o_feet"
|
||||||
|
|
||||||
|
|
10
testfiles/queries_mongo1.js
Executable file
|
@ -0,0 +1,10 @@
|
||||||
|
use admin
|
||||||
|
db.dropUser("myUserAdmin")
|
||||||
|
db.createUser(
|
||||||
|
{
|
||||||
|
user: "myUserAdmin",
|
||||||
|
pwd: "abc123",
|
||||||
|
roles: [ { role: "userAdminAnyDatabase", db: "admin" }, "readWriteAnyDatabase" ]
|
||||||
|
}
|
||||||
|
)
|
||||||
|
exit
|
99
testfiles/queries_mongo2.js
Normal file
|
@ -0,0 +1,99 @@
|
||||||
|
db
|
||||||
|
use test
|
||||||
|
db.createUser(
|
||||||
|
{
|
||||||
|
user: "myTester",
|
||||||
|
pwd: "xyz123",
|
||||||
|
roles: [ { role: "readWrite", db: "test" },
|
||||||
|
{ role: "read", db: "reporting" } ]
|
||||||
|
}
|
||||||
|
)
|
||||||
|
db.foo.insert( { x: 1, y: 1 } )
|
||||||
|
show dbs
|
||||||
|
show collections
|
||||||
|
db.getCollectionNames();
|
||||||
|
db.printCollectionStats()
|
||||||
|
show users
|
||||||
|
show roles
|
||||||
|
show profile
|
||||||
|
show databases
|
||||||
|
use myNewDatabase
|
||||||
|
db.hostInfo()
|
||||||
|
db.myCollection.insertOne( { x: 1 } );
|
||||||
|
db.getCollection("3 test").find()
|
||||||
|
db.getCollection("3-test").find()
|
||||||
|
db.getCollection("stats").find()
|
||||||
|
db.inventory.insertMany([
|
||||||
|
// MongoDB adds the _id field with an ObjectId if _id is not present
|
||||||
|
{ item: "journal", qty: 25, status: "A",
|
||||||
|
size: { h: 14, w: 21, uom: "cm" }, tags: [ "blank", "red" ] },
|
||||||
|
{ item: "notebook", qty: 50, status: "A",
|
||||||
|
size: { h: 8.5, w: 11, uom: "in" }, tags: [ "red", "blank" ] },
|
||||||
|
{ item: "paper", qty: 100, status: "D",
|
||||||
|
size: { h: 8.5, w: 11, uom: "in" }, tags: [ "red", "blank", "plain" ] },
|
||||||
|
{ item: "planner", qty: 75, status: "D",
|
||||||
|
size: { h: 22.85, w: 30, uom: "cm" }, tags: [ "blank", "red" ] },
|
||||||
|
{ item: "postcard", qty: 45, status: "A",
|
||||||
|
size: { h: 10, w: 15.25, uom: "cm" }, tags: [ "blue" ] }
|
||||||
|
]);
|
||||||
|
db.inventory.find( {} )
|
||||||
|
db.inventory.find( { status: "D" } )
|
||||||
|
db.inventory.find( { size: { h: 14, w: 21, uom: "cm" } } )
|
||||||
|
db.inventory.find( { "size.uom": "in" } )
|
||||||
|
db.inventory.find( { tags: "red" } )
|
||||||
|
db.inventory.find( { tags: ["red", "blank"] } )
|
||||||
|
use myNewDB
|
||||||
|
db.myNewCollection1.insertOne( { x: 1 } )
|
||||||
|
db.myNewCollection2.insertOne( { x: 1 } )
|
||||||
|
db.myNewCollection3.createIndex( { y: 1 } )
|
||||||
|
db.runCommand( { create: <view>, viewOn: <source>, pipeline: <pipeline> } )
|
||||||
|
db.runCommand( { create: <view>, viewOn: <source>, pipeline: <pipeline>, collation: <collation> } )
|
||||||
|
db.createView(<view>, <source>, <pipeline>, <collation> )
|
||||||
|
db.view.find().sort({$natural: 1})
|
||||||
|
db.createCollection( "log", { capped: true, size: 100000 } )
|
||||||
|
db.createCollection("log", { capped : true, size : 5242880, max : 5000 } )
|
||||||
|
db.cappedCollection.find().sort( { $natural: -1 } )
|
||||||
|
db.collection.isCapped()
|
||||||
|
db.runCommand({"convertToCapped": "mycoll", size: 100000});
|
||||||
|
var mydoc = {
|
||||||
|
_id: ObjectId("5099803df3f4948bd2f98391"),
|
||||||
|
name: { first: "Alan", last: "Turing" },
|
||||||
|
birth: new Date('Jun 23, 1912'),
|
||||||
|
death: new Date('Jun 07, 1954'),
|
||||||
|
contribs: [ "Turing machine", "Turing test", "Turingery" ],
|
||||||
|
views : NumberLong(1250000)
|
||||||
|
}
|
||||||
|
var a = new Timestamp();
|
||||||
|
db.test.insertOne( { ts: a } );
|
||||||
|
{ "_id" : ObjectId("542c2b97bac0595474108b48"), "ts" : Timestamp(1412180887, 1) }
|
||||||
|
var mydate1 = new Date()
|
||||||
|
var mydate2 = ISODate()
|
||||||
|
mydate1.toString()
|
||||||
|
mydate1.getMonth()
|
||||||
|
{
|
||||||
|
locale: <string>,
|
||||||
|
caseLevel: <boolean>,
|
||||||
|
caseFirst: <string>,
|
||||||
|
strength: <int>,
|
||||||
|
numericOrdering: <boolean>,
|
||||||
|
alternate: <string>,
|
||||||
|
maxVariable: <string>,
|
||||||
|
backwards: <boolean>
|
||||||
|
}
|
||||||
|
{ "$binary": "<bindata>", "$type": "<t>" }
|
||||||
|
db.json.insert( { longQuoted : NumberLong("9223372036854775807") } )
|
||||||
|
db.json.insert( { longUnQuoted : NumberLong(9223372036854775807) } )
|
||||||
|
db.json.find()
|
||||||
|
db.json.insert( { decimalQuoted : NumberDecimal("123.40") } )
|
||||||
|
db.json.insert( { decimalUnQuoted : NumberDecimal(123.40) } )
|
||||||
|
db.json.find()
|
||||||
|
db.students.drop( { writeConcern: { w: "majority" } } )
|
||||||
|
db.students.drop()
|
||||||
|
db.printCollectionStats()
|
||||||
|
db.printReplicationInfo()
|
||||||
|
db.printShardingStatus()
|
||||||
|
db.printSlaveReplicationInfo()
|
||||||
|
db.repairDatabase()
|
||||||
|
db.resetError()
|
||||||
|
db.getMongo()
|
||||||
|
exit
|
141
testfiles/queries_mysql.txt
Executable file
|
@ -0,0 +1,141 @@
|
||||||
|
SELECT '<info_to_display>' AS ' ';
|
||||||
|
SHOW VARIABLES LIKE '%ssl%';
|
||||||
|
SHOW SESSION STATUS LIKE 'Ssl_cipher';
|
||||||
|
SET @plaintextpassword = 'hello';
|
||||||
|
SET @USER = 'hello';
|
||||||
|
SELECT UPPER(SHA1(UNHEX(SHA1(@plaintextpassword)))) PWD_CREATION;
|
||||||
|
SELECT PASSWORD(@plaintextpassword) PWD_FUNCTION;
|
||||||
|
SELECT User, Host FROM mysql.user;
|
||||||
|
SELECT authentication_string FROM mysql.user;
|
||||||
|
SHOW VARIABLES LIKE 'validate_password%';
|
||||||
|
#UNINSTALL PLUGIN validate_password;
|
||||||
|
DROP USER IF EXISTS 'test'@'localhost';
|
||||||
|
FLUSH privileges;
|
||||||
|
CREATE USER 'test'@'localhost' IDENTIFIED BY 'test';
|
||||||
|
GRANT ALL PRIVILEGES ON *.* TO 'test'@'localhost';
|
||||||
|
SHOW DATABASES;
|
||||||
|
DROP DATABASE IF EXISTS menagerie;
|
||||||
|
CREATE DATABASE menagerie;
|
||||||
|
USE menagerie;
|
||||||
|
SHOW TABLES;
|
||||||
|
DROP TABLE IF EXISTS pet;
|
||||||
|
SHOW TABLES;
|
||||||
|
CREATE TABLE pet (name VARCHAR(20), owner VARCHAR(20), species VARCHAR(20), sex CHAR(1), birth DATE, death DATE);
|
||||||
|
DESCRIBE pet;
|
||||||
|
INSERT INTO pet VALUES ('Fluffy','Harold','cat','f','1993-02-04', NULL);
|
||||||
|
INSERT INTO pet VALUES ('Claws','Gwen','cat','m','1994-03-17', NULL);
|
||||||
|
INSERT INTO pet VALUES ('Buffy','Harold','dog','f','1989-05-13',NULL);
|
||||||
|
INSERT INTO pet VALUES ('Fang',' Benny','dog','m','1990-08-27', NULL);
|
||||||
|
INSERT INTO pet VALUES ('Whistler','Gwen','bird',NULL,'1997-12-09', NULL);
|
||||||
|
INSERT INTO pet VALUES ('Slim','Benny','snake','m','1996-04-29', NULL);
|
||||||
|
SELECT * FROM pet;
|
||||||
|
DELETE FROM pet;
|
||||||
|
INSERT INTO pet VALUES ('Puffball','Diane','hamster','f','1999-03-30',NULL);
|
||||||
|
UPDATE pet SET birth = '1989-08-31' WHERE name = 'Bowser';
|
||||||
|
SELECT * FROM pet;
|
||||||
|
SELECT * FROM pet WHERE name = 'Bowser';
|
||||||
|
SELECT * FROM pet WHERE birth >= '1998-1-1';
|
||||||
|
SELECT * FROM pet WHERE species = 'dog' AND sex = 'f';
|
||||||
|
SELECT * FROM pet WHERE species = 'snake' OR species = 'bird';
|
||||||
|
SELECT * FROM pet WHERE (species = 'cat' AND sex = 'm') OR (species = 'dog' AND sex = 'f');
|
||||||
|
SELECT name, birth FROM pet;
|
||||||
|
SELECT owner FROM pet;
|
||||||
|
SELECT DISTINCT owner FROM pet;
|
||||||
|
SELECT name, species, birth FROM pet WHERE species = 'dog' OR species = 'cat';
|
||||||
|
SELECT name, birth FROM pet ORDER BY birth;
|
||||||
|
SELECT name, birth FROM pet ORDER BY birth DESC;
|
||||||
|
SELECT name, species, birth FROM pet ORDER BY species, birth DESC;
|
||||||
|
SELECT name, birth, CURDATE(), TIMESTAMPDIFF(YEAR,birth,CURDATE()) AS age FROM pet;
|
||||||
|
SELECT name, birth, CURDATE(), TIMESTAMPDIFF(YEAR,birth,CURDATE()) AS age FROM pet ORDER BY name;
|
||||||
|
SELECT name, birth, death, TIMESTAMPDIFF(YEAR,birth,death) AS age FROM pet WHERE death IS NOT NULL ORDER BY age;
|
||||||
|
SELECT name, birth, MONTH(birth) FROM pet;
|
||||||
|
SELECT name, birth FROM pet WHERE MONTH(birth) = 5;
|
||||||
|
SELECT name, birth FROM pet WHERE MONTH(birth) = MOD(MONTH(CURDATE()), 12) + 1;
|
||||||
|
SELECT '2018-10-31' + INTERVAL 1 DAY;
|
||||||
|
SELECT '2018-10-32' + INTERVAL 1 DAY;
|
||||||
|
SHOW WARNINGS;
|
||||||
|
SELECT 1 IS NULL, 1 IS NOT NULL;
|
||||||
|
SELECT 1 = NULL, 1 <> NULL, 1 < NULL, 1 > NULL;
|
||||||
|
SELECT 0 IS NULL, 0 IS NOT NULL, '' IS NULL, '' IS NOT NULL;
|
||||||
|
SELECT * FROM pet WHERE name LIKE 'b%';
|
||||||
|
SELECT * FROM pet WHERE name LIKE '%fy';
|
||||||
|
SELECT * FROM pet WHERE name LIKE '%w%';
|
||||||
|
SELECT * FROM pet WHERE name LIKE '_____';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP '^b';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP BINARY '^b';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP 'fy$';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP 'w';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP '^.....$';
|
||||||
|
SELECT * FROM pet WHERE name REGEXP '^.{5}$';
|
||||||
|
SELECT COUNT(*) FROM pet;
|
||||||
|
SELECT owner, COUNT(*) FROM pet GROUP BY owner;
|
||||||
|
SELECT species, COUNT(*) FROM pet GROUP BY species;
|
||||||
|
SELECT sex, COUNT(*) FROM pet GROUP BY sex;
|
||||||
|
SELECT species, sex, COUNT(*) FROM pet GROUP BY species, sex;
|
||||||
|
SELECT species, sex, COUNT(*) FROM pet WHERE species = 'dog' OR species = 'cat' GROUP BY species, sex;
|
||||||
|
SELECT species, sex, COUNT(*) FROM pet WHERE sex IS NOT NULL GROUP BY species, sex;
|
||||||
|
SET sql_mode = 'ONLY_FULL_GROUP_BY';
|
||||||
|
SET sql_mode = '';
|
||||||
|
SELECT owner, COUNT(*) FROM pet;
|
||||||
|
DROP TABLE IF EXISTS event;
|
||||||
|
SHOW TABLES;
|
||||||
|
CREATE TABLE event (name VARCHAR(20), date DATE, type VARCHAR(15), remark VARCHAR(255));
|
||||||
|
INSERT INTO event VALUES ('Fluffy','1995-05-15','litter','4 kittens, 3 female, 1 male');
|
||||||
|
INSERT INTO event VALUES ('Buffy','1993-06-23','litter','5 puppies, 2 female, 3 male');
|
||||||
|
INSERT INTO event VALUES ('Buffy','1994-06-19','litter','3 puppies, 3 female');
|
||||||
|
INSERT INTO event VALUES ('Chirpy','1999-03-21','vet','needed beak straightened');
|
||||||
|
INSERT INTO event VALUES ('Slim','1997-08-03','vet','broken rib');
|
||||||
|
INSERT INTO event VALUES ('Bowser','1991-10-12','kennel', NULL);
|
||||||
|
INSERT INTO event VALUES ('Fang','1991-10-12','kennel',NULL);
|
||||||
|
INSERT INTO event VALUES ('Fang','1998-08-28','birthday','Gave him a new chew toy');
|
||||||
|
INSERT INTO event VALUES ('Claws','1998-03-17','birthday','Gave him a new flea collar');
|
||||||
|
INSERT INTO event VALUES ('Whistler','1998-12-09','birthday','First birthday');
|
||||||
|
SELECT pet.name, TIMESTAMPDIFF(YEAR,birth,date) AS age, remark FROM pet INNER JOIN event ON pet.name = event.name WHERE event.type = 'litter';
|
||||||
|
SELECT p1.name, p1.sex, p2.name, p2.sex, p1.species FROM pet AS p1 INNER JOIN pet AS p2 ON p1.species = p2.species AND p1.sex = 'f' AND p1.death IS NULL AND p2.sex = 'm' AND p2.death IS NULL;
|
||||||
|
SELECT DATABASE();
|
||||||
|
DESCRIBE pet;
|
||||||
|
DROP TABLE IF EXISTS shop;
|
||||||
|
CREATE TABLE shop (article INT(4) UNSIGNED ZEROFILL DEFAULT '0000' NOT NULL, dealer CHAR(20) DEFAULT '' NOT NULL, price DOUBLE(16,2) DEFAULT '0.00' NOT NULL, PRIMARY KEY(article, dealer));
|
||||||
|
INSERT INTO shop VALUES (1,'A',3.45),(1,'B',3.99),(2,'A',10.99),(3,'B',1.45), (3,'C',1.69),(3,'D',1.25),(4,'D',19.95);
|
||||||
|
SELECT * FROM shop ORDER BY article;
|
||||||
|
SELECT MAX(article) AS article FROM shop;
|
||||||
|
SELECT article, dealer, price FROM shop WHERE price=(SELECT MAX(price) FROM shop);
|
||||||
|
SELECT s1.article, s1.dealer, s1.price FROM shop s1 LEFT JOIN shop s2 ON s1.price < s2.price WHERE s2.article IS NULL;
|
||||||
|
SELECT article, dealer, price FROM shop ORDER BY price DESC LIMIT 1;
|
||||||
|
SELECT article, MAX(price) AS price FROM shop GROUP BY article ORDER BY article;
|
||||||
|
SELECT article, dealer, price FROM shop s1 WHERE price=(SELECT MAX(s2.price) FROM shop s2 WHERE s1.article = s2.article) ORDER BY article;
|
||||||
|
SELECT s1.article, dealer, s1.price
|
||||||
|
FROM shop s1
|
||||||
|
JOIN (SELECT article, MAX(price) AS price FROM shop GROUP BY article) AS s2 ON s1.article = s2.article AND s1.price = s2.price ORDER BY article;
|
||||||
|
SELECT s1.article, s1.dealer, s1.price FROM shop s1 LEFT JOIN shop s2 ON s1.article = s2.article AND s1.price < s2.price WHERE s2.article IS NULL ORDER BY s1.article;
|
||||||
|
SELECT @min_price:=MIN(price),@max_price:=MAX(price) FROM shop;
|
||||||
|
SELECT * FROM shop WHERE price=@min_price OR price=@max_price;
|
||||||
|
CREATE TABLE person (id SMALLINT UNSIGNED NOT NULL AUTO_INCREMENT, name CHAR(60) NOT NULL, PRIMARY KEY (id));
|
||||||
|
CREATE TABLE shirt (id SMALLINT UNSIGNED NOT NULL AUTO_INCREMENT, style ENUM('t-shirt', 'polo', 'dress') NOT NULL, color ENUM('red', 'blue', 'orange', 'white', 'black') NOT NULL, owner SMALLINT UNSIGNED NOT NULL REFERENCES person(id), PRIMARY KEY (id));
|
||||||
|
INSERT INTO person VALUES (NULL, 'Antonio Paz');
|
||||||
|
SELECT @last := LAST_INSERT_ID();
|
||||||
|
INSERT INTO shirt VALUES(NULL, 'polo', 'blue', @last), (NULL, 'dress', 'white', @last), (NULL, 't-shirt', 'blue', @last);
|
||||||
|
INSERT INTO person VALUES (NULL, 'Lilliana Angelovska');
|
||||||
|
SELECT @last := LAST_INSERT_ID();
|
||||||
|
INSERT INTO shirt VALUES(NULL, 'dress', 'orange', @last),(NULL, 'polo', 'red', @last),(NULL, 'dress', 'blue', @last),(NULL, 't-shirt', 'white', @last);
|
||||||
|
SELECT * FROM person;
|
||||||
|
SELECT * FROM shirt;
|
||||||
|
SELECT s.* FROM person p INNER JOIN shirt s ON s.owner = p.id WHERE p.name LIKE 'Lilliana%' AND s.color <> 'white';
|
||||||
|
SHOW CREATE TABLE shirt\G
|
||||||
|
CREATE TABLE t1 (year YEAR(4), month INT(2) UNSIGNED ZEROFILL, day INT(2) UNSIGNED ZEROFILL);
|
||||||
|
INSERT INTO t1 VALUES(2000,1,1),(2000,1,20),(2000,1,30),(2000,2,2), (2000,2,23),(2000,2,23);
|
||||||
|
SELECT year,month,BIT_COUNT(BIT_OR(1<<day)) AS days FROM t1 GROUP BY year,month;
|
||||||
|
CREATE TABLE animals (id MEDIUMINT NOT NULL AUTO_INCREMENT, name CHAR(30) NOT NULL, PRIMARY KEY (id));
|
||||||
|
INSERT INTO animals (name) VALUES('dog'),('cat'),('penguin'), ('lax'),('whale'),('ostrich');
|
||||||
|
SELECT * FROM animals;
|
||||||
|
INSERT INTO animals (id,name) VALUES(0,'groundhog');
|
||||||
|
INSERT INTO animals (id,name) VALUES(NULL,'squirrel');
|
||||||
|
INSERT INTO animals (id,name) VALUES(100,'rabbit');
|
||||||
|
INSERT INTO animals (id,name) VALUES(NULL,'mouse');
|
||||||
|
SELECT * FROM animals;
|
||||||
|
DROP DATABASE IF EXISTS menagerie;
|
||||||
|
DROP USER IF EXISTS testSSL;
|
||||||
|
FLUSH PRIVILEGES;
|
||||||
|
#CREATE USER 'testSSL'@'localhost' IDENTIFIED BY 'testSSL' REQUIRE SSL;
|
||||||
|
GRANT ALL ON example.* TO 'testSSL'@'localhost';
|
||||||
|
FLUSH PRIVILEGES;
|
3907
testfiles/queries_postgres.sql
Executable file
File diff suppressed because it is too large
24
testfiles/test_avahi.json
Executable file
|
@ -0,0 +1,24 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"avahi-browse --all --ignore-local --resolve --terminate",
|
||||||
|
"nmap localhost",
|
||||||
|
"ping 8.8.8.8",
|
||||||
|
"avahi-browse --all --resolve --terminate",
|
||||||
|
"avahi-browse --all --resolve",
|
||||||
|
"avahi-browse --all ",
|
||||||
|
"avahi-browse-domains ",
|
||||||
|
"avahi-resolve-address 192.168.1.49",
|
||||||
|
"avahi-resolve-host-name dlanwireless.lan",
|
||||||
|
"avahi-resolve-host-name dlanwireless",
|
||||||
|
"avahi-resolve-host-name dlanwire",
|
||||||
|
"avahi-resolve-address 192.168.1",
|
||||||
|
"avahi-set-host-name dlanwireless",
|
||||||
|
"sudo avahi-set-host-name dlwireless",
|
||||||
|
"avahi-resolve-host-name dlwireless",
|
||||||
|
"sudo avahi-publish-address 192.168.1.111 hello",
|
||||||
|
"sudo avahi-publish-service dns DNS 53",
|
||||||
|
"sudo avahi-daemon -k"
|
||||||
|
]
|
||||||
|
}
|
33
testfiles/test_bind9.json
Executable file
|
@ -0,0 +1,33 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"dig unicore-project.eu +nssearch",
|
||||||
|
"sudo rndc dumpdb -cache",
|
||||||
|
"sudo rndc flush",
|
||||||
|
"sudo rndc status",
|
||||||
|
"sudo rndc reload",
|
||||||
|
"sudo rndc zonestatus unicore-project.eu",
|
||||||
|
"sudo rndc reload unicore-project.eu",
|
||||||
|
"sudo rndc refresh zone unicore-project.eu",
|
||||||
|
"sudo rndc refresh unicore-project.eu",
|
||||||
|
"sudo rndc status",
|
||||||
|
"sudo rndc thaw unicore-project.eu",
|
||||||
|
"sudo rndc thaw ",
|
||||||
|
"sudo rndc notify",
|
||||||
|
"sudo rndc dumpdb -all",
|
||||||
|
"sudo rndc sign unicore-project.eu",
|
||||||
|
"sudo rndc secroots",
|
||||||
|
"sudo rndc trace",
|
||||||
|
"sudo rndc recursing",
|
||||||
|
"sudo rndc freeze",
|
||||||
|
"sudo rndc freeze unicore-project.eu",
|
||||||
|
"sudo rndc notify unicore-project.eu",
|
||||||
|
"nslookup unicore-project.eu",
|
||||||
|
"nslookup 141.85.241.196",
|
||||||
|
"dig unicore-project.eu +nssearch",
|
||||||
|
"sudo rndc notrace",
|
||||||
|
"sudo rndc restart",
|
||||||
|
"sudo rndc stop"
|
||||||
|
]
|
||||||
|
}
|
22
testfiles/test_email.json
Executable file
|
@ -0,0 +1,22 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"swaks --to user@example.com --server localhost --port 25",
|
||||||
|
"swaks --to user@example.com --from localhost --auth CRAM-MD5 --auth-user me@example.com --header-X-Test \"test email\", --server localhost --port 25",
|
||||||
|
"swaks -t user@example.com --attach - --server localhost --port 25 --suppress-data </path/to/eicar.txt",
|
||||||
|
"swaks --to user@example.com --body \",de\" --server localhost --port 25",
|
||||||
|
"swaks --to user@example.com --socket /var/lda.sock --protocol LMTP --server localhost --port 25",
|
||||||
|
"swaks --to someone@somewhere.net --from postmaster@yourdomain.xy --server localhost --port 25 --ehlo test -tls --auth login --auth-user \",postmaster@yourdomain.xy --auth-password password",
|
||||||
|
"swaks --add-header \"X-Test-Header: foo\" --to someone@somewhere.net --from postmaster@yourdomain.xy --server localhost --port 25",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com,testme@example.com",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com --server mail.example.com",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com --quit-after RCPT",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com --body /path/to/gtube/file",
|
||||||
|
"swaks --server localhost --port 25 -f someone@example.net -t liquidat@example.com --body /path/to/eicar/file",
|
||||||
|
"swaks--server localhost --port 25 -tls example.com -f liquidat@example.com -t someone@example.net -ao --auth-user=liquidat",
|
||||||
|
"swaks --server localhost --port 25 -tls -s example.com -f someone@example.net -t liquidat@example.com --ehlo $(host $(wget \",http://automation.whatismyip.com/n09230945.asp -O - -q))",
|
||||||
|
"swaks -f someone@example.net -t liquidat@example.com --add-header \"X-Custom-Header: Swaks-Tested\""
|
||||||
|
]
|
||||||
|
}
|
7
testfiles/test_firebird.json
Executable file
|
@ -0,0 +1,7 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"gsec -user sysdba -password"
|
||||||
|
]
|
||||||
|
}
|
31
testfiles/test_http.json
Normal file
|
@ -0,0 +1,31 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 1000,
|
||||||
|
"listCommands": [
|
||||||
|
"curl -v http://localhost:80",
|
||||||
|
"curl -v -k http://localhost:80",
|
||||||
|
"curl -v http://localhost:80/index.html",
|
||||||
|
"curl -v http://localhost:80/",
|
||||||
|
"curl -d 'id=0&name=test' http://localhost:80",
|
||||||
|
"curl -d '{\"id\":9,\",name\",:\",test\",}' -H 'Content-Type: application/json' http://localhost:80",
|
||||||
|
"curl -X DELETE http://localhost:80/test",
|
||||||
|
"curl -o newfile.tar.gz http://localhost:80/file.tar.gz",
|
||||||
|
"curl -C - -O http://localhost:80/file.tar.gz",
|
||||||
|
"curl -I http://localhost:80/",
|
||||||
|
"curl --data \"firstName=John&lastName=Doe\", http://localhost:80/info.php",
|
||||||
|
"curl -I http://localhost:80 --user-agent \"I am a new web browser\",",
|
||||||
|
"curl --cookie-jar cnncookies.txt http://localhost:80/ -O",
|
||||||
|
"curl --cookie cnncookies.txt http://localhost:80/",
|
||||||
|
"curl -s -O http://localhost:80/",
|
||||||
|
"curl --limit-rate 100K http://localhost:80/ -O",
|
||||||
|
"curl -OL http://localhost:80/",
|
||||||
|
"curl -I --http2 http://localhost:80/",
|
||||||
|
"curl -I --http2 -s http://localhost:80/ | grep HTTP",
|
||||||
|
"curl http://localhost:80/",
|
||||||
|
"curl -L http://localhost:80/",
|
||||||
|
"curl -A \"Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0\", http://localhost:80/",
|
||||||
|
"curl -u username:password http://localhost:80/",
|
||||||
|
"curl -v http://localhost:80/admin.php",
|
||||||
|
"curl http://localhost:80/tests/sample.html"
|
||||||
|
]
|
||||||
|
}
|
52
testfiles/test_https.json
Normal file
|
@ -0,0 +1,52 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"curl -k -v https://localhost:443",
|
||||||
|
"curl -k -v https://localhost:443/index.html",
|
||||||
|
"curl -k -v https://localhost:443",
|
||||||
|
"curl -k -d 'id=0&name=test' https://localhost:443",
|
||||||
|
"curl -k -d '{\"id\",:9,\"name\",:\",test\",}' -H 'Content-Type: application/json' https://localhost:443",
|
||||||
|
"curl -k -X DELETE https://localhost:443/test",
|
||||||
|
"curl -k -o newfile.tar.gz https://localhost:443/file.tar.gz",
|
||||||
|
"curl -k -C - -O https://localhost:443/file.tar.gz",
|
||||||
|
"curl -k -I https://localhost:443/",
|
||||||
|
"curl -k --data \"firstName=John&lastName=Doe\", https://localhost:443/info.php",
|
||||||
|
"curl -k -I https://localhost:443 --user-agent \"I am a new web browser\"",
|
||||||
|
"curl -k --cookie-jar cnncookies.txt https://localhost:443/ -O",
|
||||||
|
"curl -k --cookie cnncookies.txt https://localhost:443/",
|
||||||
|
"curl -k -s -O https://localhost:443/",
|
||||||
|
"curl -k --limit-rate 100K https://localhost:443/ -O",
|
||||||
|
"curl -k -OL https://localhost:443/",
|
||||||
|
"curl -k -I --http2 https://localhost:443/",
|
||||||
|
"curl -k -I --http2 -s https://localhost:443/ | grep HTTP",
|
||||||
|
"curl -k https://localhost:443/",
|
||||||
|
"curl -k -L https://localhost:443/",
|
||||||
|
"curl -k -A \"Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0\", https://localhost:443/",
|
||||||
|
"curl -k -u username:password https://localhost:443/",
|
||||||
|
"curl -k -v https://localhost:443/admin.php",
|
||||||
|
"curl -v https://localhost:443",
|
||||||
|
"curl -v https://localhost:443/index.html",
|
||||||
|
"curl -v https://localhost:443",
|
||||||
|
"curl -d 'id=0&name=test' https://localhost:443",
|
||||||
|
"curl -d '{\"id\",:9,\",name\",:\",test\",}' -H 'Content-Type: application/json' https://localhost:443",
|
||||||
|
"curl -X DELETE https://localhost:443/test",
|
||||||
|
"curl -o newfile.tar.gz https://localhost:443/file.tar.gz",
|
||||||
|
"curl -C - -O https://localhost:443/file.tar.gz",
|
||||||
|
"curl -I https://localhost:443/",
|
||||||
|
"curl --data \"firstName=John&lastName=Doe\", https://localhost:443/info.php",
|
||||||
|
"curl -I https://localhost:443 --user-agent \"I am a new web browser\",",
|
||||||
|
"curl --cookie-jar cnncookies.txt https://localhost:443/ -O",
|
||||||
|
"curl --cookie cnncookies.txt https://localhost:443/",
|
||||||
|
"curl -s -O https://localhost:443/",
|
||||||
|
"curl --limit-rate 100K https://localhost:443/ -O",
|
||||||
|
"curl -OL https://localhost:443/",
|
||||||
|
"curl -I --http2 https://localhost:443/",
|
||||||
|
"curl -I --http2 -s https://localhost:443/ | grep HTTP",
|
||||||
|
"curl https://localhost:443/",
|
||||||
|
"curl -L https://localhost:443/",
|
||||||
|
"curl -A \"Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0\" https://localhost:443/",
|
||||||
|
"curl -u username:password https://localhost:443/",
|
||||||
|
"curl -v https://localhost:443/admin.php"
|
||||||
|
]
|
||||||
|
}
|
14
testfiles/test_influx.json
Executable file
|
@ -0,0 +1,14 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"influx -execute 'SHOW DATABASES'",
|
||||||
|
"influx -execute 'SELECT * FROM \"h2o_feet\" LIMIT 3' -database=\",NOAA_water_database\", -precision=rfc3339 name: h2o_feet",
|
||||||
|
"influx -execute 'SHOW DATABASES'",
|
||||||
|
"influx -execute 'DROP DATABASE \"NOAA_water_database\",'",
|
||||||
|
"influx -execute 'CREATE DATABASE NOAA_water_database'",
|
||||||
|
"curl https://s3.amazonaws.com/noaa.water-database/NOAA_data.txt -o NOAA_data.txt",
|
||||||
|
"influx -import -path=NOAA_data.txt -precision=s -database=NOAA_water_database",
|
||||||
|
"influx -import -path=queries_influx.txt -precision=s"
|
||||||
|
]
|
||||||
|
}
|
35
testfiles/test_knot.json
Executable file
|
@ -0,0 +1,35 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 1000,
|
||||||
|
"listCommands": [
|
||||||
|
"sudo knotc status",
|
||||||
|
"dig @127.0.0.1 SOA example.com",
|
||||||
|
"sudo knotc zonestatus",
|
||||||
|
"sudo knotc reload ",
|
||||||
|
"sudo knotc stats",
|
||||||
|
"sudo knotc -h",
|
||||||
|
"dig @127.0.0.1 SOA whoami.domain.example",
|
||||||
|
"sudo knotc status ",
|
||||||
|
"sudo knotc",
|
||||||
|
"sudo knotc -h",
|
||||||
|
"sudo knotc stats",
|
||||||
|
"sudo knotc -h",
|
||||||
|
"sudo knotc zone-read example",
|
||||||
|
"sudo knotc zone-read example.com",
|
||||||
|
"sudo knotc zone-refresh example.com",
|
||||||
|
"sudo knotc zone-reload example.com",
|
||||||
|
"sudo knotc zone-get example.com",
|
||||||
|
"sudo knotc zone-stats example.com",
|
||||||
|
"sudo knotc zone-status example.com",
|
||||||
|
"sudo knotc conf-check",
|
||||||
|
"sudo knotc conf-list",
|
||||||
|
"sudo knotc conf-diff",
|
||||||
|
"sudo knotc conf-export test.db",
|
||||||
|
"sudo knotc conf-import test.db",
|
||||||
|
"sudo knotc zone-flush example.com",
|
||||||
|
"sudo knotc zone-status",
|
||||||
|
"sudo knotc zone-check ",
|
||||||
|
"sudo knotc stats",
|
||||||
|
"sudo knotc stop"
|
||||||
|
]
|
||||||
|
}
|
25
testfiles/test_memcached.json
Executable file
|
@ -0,0 +1,25 @@
|
||||||
|
{
|
||||||
|
"typeTest": "telnet",
|
||||||
|
"addressTelnet": "127.0.0.1",
|
||||||
|
"portTelnet": 11211,
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"stats",
|
||||||
|
"stats slabs",
|
||||||
|
"stats items",
|
||||||
|
"stats cachedump 1 0",
|
||||||
|
"get testkey",
|
||||||
|
"VALUE testkey 0 9",
|
||||||
|
"test data",
|
||||||
|
"add newkey 0 60 5",
|
||||||
|
"replace key 0 60 5",
|
||||||
|
"prepend key 0 60 15",
|
||||||
|
"stats malloc",
|
||||||
|
"flush_all",
|
||||||
|
"set hello 0 900 9",
|
||||||
|
"get hello",
|
||||||
|
"VALUE world 0 9",
|
||||||
|
"get hello",
|
||||||
|
"delete hello"
|
||||||
|
]
|
||||||
|
}
|
11
testfiles/test_mongodb.json
Normal file
|
@ -0,0 +1,11 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 4000,
|
||||||
|
"listCommands": [
|
||||||
|
"mongo -eval \"db.hostInfo()\"",
|
||||||
|
"mongo test_db -eval \"printjson(db.getCollectionNames())\"",
|
||||||
|
"mongo test_db -eval \"db.help()\"",
|
||||||
|
"mongo < queries_mongo1.js",
|
||||||
|
"mongo \"mongodb://myUserAdmin:abc123@localhost/test?authSource=admin\" < queries_mongo2.js"
|
||||||
|
]
|
||||||
|
}
|
8
testfiles/test_mysql.json
Executable file
|
@ -0,0 +1,8 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 3000,
|
||||||
|
"listCommands": [
|
||||||
|
"mysql -uroot --execute=\"SHOW DATABASES; \"",
|
||||||
|
"mysql -uroot < /home/gain/go/src/tools/testfiles/queries_mysql.txt"
|
||||||
|
]
|
||||||
|
}
|
15
testfiles/test_postgres.json
Executable file
|
@ -0,0 +1,15 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\conninfo\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\l\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\dt\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\dn\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\df\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\dv\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"\du\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -c \"SELECT version();\"",
|
||||||
|
"PGPASSWORD=postgres psql -U postgres -a -w -f queries_postgres.sql"
|
||||||
|
]
|
||||||
|
}
|
31
testfiles/test_redis.json
Executable file
|
@ -0,0 +1,31 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 1500,
|
||||||
|
"listCommands": [
|
||||||
|
"redis-cli flushall",
|
||||||
|
"redis-cli ping",
|
||||||
|
"redis-cli -n 1 incr a",
|
||||||
|
"redis-cli -n 1 incr a",
|
||||||
|
"redis-cli set foo bar",
|
||||||
|
"redis-cli get foo",
|
||||||
|
"redis-cli incr mycounter",
|
||||||
|
"redis-cli incr mycounter",
|
||||||
|
"redis-cli -r 100 incr foo",
|
||||||
|
"redis-cli lpush mylist a b c d",
|
||||||
|
"redis-cli --csv lrange mylist 0 -1",
|
||||||
|
"redis-cli --eval /tmp/script.lua foo , bar",
|
||||||
|
"redis-cli -i 1 INFO | grep rss_human",
|
||||||
|
"redis-cli select 2",
|
||||||
|
"redis-cli dbsize",
|
||||||
|
"redis-cli select 0",
|
||||||
|
"redis-cli dbsize",
|
||||||
|
"redis-cli --bigkeys",
|
||||||
|
"redis-cli --scan | head -10",
|
||||||
|
"redis-cli --scan --pattern '*-11*'",
|
||||||
|
"redis-cli --scan --pattern 'user:*' | wc -l",
|
||||||
|
"redis-cli --intrinsic-latency 5",
|
||||||
|
"redis-cli --rdb /tmp/dump.rdb",
|
||||||
|
"redis-cli --slave",
|
||||||
|
"redis-cli --latency"
|
||||||
|
]
|
||||||
|
}
|
57
testfiles/test_sqlite.json
Normal file
|
@ -0,0 +1,57 @@
|
||||||
|
{
|
||||||
|
"typeTest": "stdin",
|
||||||
|
"timeMsCommand": 1000,
|
||||||
|
"listCommands": [
|
||||||
|
"create table tbl1(one varchar(10), two smallint);",
|
||||||
|
"insert into tbl1 values('hello!',10);",
|
||||||
|
"insert into tbl1 values('goodbye', 20);",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode list",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".separator ",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode quote",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode line",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode column",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".width 12 6",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".header off",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode insert new_table",
|
||||||
|
"select * from tbl1;",
|
||||||
|
".mode list",
|
||||||
|
".separator |",
|
||||||
|
".output test_file_1.txt",
|
||||||
|
"select * from tbl1;",
|
||||||
|
"SELECT * FROM tbl1;",
|
||||||
|
".once -x",
|
||||||
|
"SELECT * FROM tbl1;",
|
||||||
|
"CREATE TABLE images(name TEXT, type TEXT, img BLOB);",
|
||||||
|
"UPDATE docs SET body=edit(body) WHERE name='report-15';",
|
||||||
|
"UPDATE pics SET img=edit(img,'gimp') WHERE id='pic-1542';",
|
||||||
|
"SELECT length(edit(img,'gimp')) WHERE id='pic-1542';",
|
||||||
|
".header on",
|
||||||
|
".mode csv",
|
||||||
|
"SELECT * FROM tbl1;",
|
||||||
|
".system export.csv",
|
||||||
|
"create table tab1(one varchar(10), two smallint);",
|
||||||
|
".import export.csv tab1",
|
||||||
|
".save ex1.db",
|
||||||
|
".help",
|
||||||
|
"CREATE TABLE selftest(tno INTEGER PRIMARY KEY, op TEXT, cmd TEXT, ans TEXT);",
|
||||||
|
"CREATE TABLE tbl2 (f1 varchar(30) primary key,f2 text,f3 real);",
|
||||||
|
"insert into tbl1 values(10, 'hello!',10);",
|
||||||
|
"insert into tbl2 values(10, 'hello!',10);",
|
||||||
|
"UPDATE tbl2 SET f2=\",salut\", WHERE f2='hello!';",
|
||||||
|
"select * from tbl2;",
|
||||||
|
"UPDATE tbl2 SET text=edit(text, \"salut\",) WHERE text='hello!';",
|
||||||
|
"select * from tbl2;",
|
||||||
|
"DELETE tbl2 WHERE f2=\"salut\";",
|
||||||
|
"DELETE FROM tbl2 WHERE f2=\"salut\",;",
|
||||||
|
"DROP TABLE tbl2;",
|
||||||
|
".quit"
|
||||||
|
]
|
||||||
|
}
|
33
testfiles/text_exim.json
Executable file
|
@ -0,0 +1,33 @@
|
||||||
|
{
|
||||||
|
"typeTest": "exec",
|
||||||
|
"timeMsCommand": 2000,
|
||||||
|
"listCommands": [
|
||||||
|
"sudo exiwhat",
|
||||||
|
"sudo exim -bP",
|
||||||
|
"sudo mailq",
|
||||||
|
"sudo exim -bpc",
|
||||||
|
"sudo exim -bp | exiqsumm",
|
||||||
|
"sudo exim -bt monmail@domaine.fr",
|
||||||
|
"sudo exiqgrep -f [user]@domaine",
|
||||||
|
"sudo exiqgrep -r [user]@domaine",
|
||||||
|
"sudo exiqgrep -o 120",
|
||||||
|
"sudo exiqgrep -y 120",
|
||||||
|
"sudo exim -Mf message-id",
|
||||||
|
"sudo exim -Mt message-id",
|
||||||
|
"sudo exim -M message-id",
|
||||||
|
"sudo exim -Mvl message-id",
|
||||||
|
"sudo exim -Mvh message-id",
|
||||||
|
"sudo exim -Mvb message-id",
|
||||||
|
"sudo exim -Mrm message-id",
|
||||||
|
"sudo exim -qf",
|
||||||
|
"sudo exim -qff",
|
||||||
|
"sudo exim -Mes message-id address",
|
||||||
|
"sudo exim -bpr | grep -Eo \"<[^ ]*@[^ ]*>\", | sort | uniq -c",
|
||||||
|
"sudo exim -bpr | grep -Eo \"^\s*[^ ]*@[^ ]*$\", | sort | uniq -c",
|
||||||
|
"sudo exiqgrep -o 43000 -i | xargs exim -Mrm",
|
||||||
|
"sudo exiqgrep -z -i | xargs exim -Mrm",
|
||||||
|
"sudo exiqgrep -i -f [user]@domaine | xargs exim -Mrm",
|
||||||
|
"sudo exiqgrep -o 43000 -i -f [user]@domaine | xargs exim -Mrm",
|
||||||
|
"sudo grep -lr 'bla bla bla' /var/spool/exim/input/ | sed -e 's/^.*\/\([a-zA-Z0-9-]*\)-[DH]$/\1/g' | xargs exim -Mrm"
|
||||||
|
]
|
||||||
|
}
|