kumes / aggraphsearch

agGraphSearch: an R package for searching RDF graph data.

Home Page: https://kumes.github.io/agGraphSearch/

R 5.43% Shell 0.01% HTML 94.55% Dockerfile 0.01%
sparql sparql-query sparql-endpoints wikidata graph-data

aggraphsearch's Introduction

agGraphSearch (ver 0.99.x)

Introduction

The agGraphSearch package supplies a tool set for searching RDF (Resource Description Framework) graph structures and extracting the subset of the class hierarchy related to domain-specific terms.

This set of functions also lets users explore triple-formatted datasets without writing any SPARQL queries in their R scripts.

See the workflow submitted to IJCKG 2021.
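As a rough sketch of what these functions automate, the snippet below sends a hand-written SPARQL query to the Wikidata Query Service using the archived SPARQL package installed in the next section. The endpoint URL, the query text, and the example label "polymer" are illustrative assumptions rather than the package's exact internals, and the public endpoint may rate-limit or refuse requests without a suitable user agent.

# Illustrative only: count the upper classes ("subclass of" P279 or
# "instance of" P31 targets) reachable from an entity labelled "polymer".
# This is roughly the kind of query the agCount_*_P279_P31 functions wrap.
library(SPARQL)
endpoint <- "https://query.wikidata.org/sparql"
query <- '
SELECT (COUNT(DISTINCT ?class) AS ?n) WHERE {
  ?item rdfs:label "polymer"@en .
  ?item wdt:P279|wdt:P31 ?class .
}'
res <- SPARQL(url = endpoint, query = query)
res$results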

Installation

  1. Start R.app

  2. Run the following commands in the R console.

# Install the archived SPARQL package (version 1.16) from the CRAN archive
URL <- "https://cran.r-project.org/src/contrib/Archive/SPARQL/SPARQL_1.16.tar.gz"
install.packages(URL, repos = NULL, type = "source")
library(SPARQL)

# Install agGraphSearch from GitHub via devtools
install.packages("devtools")
devtools::install_github("kumeS/agGraphSearch", force = TRUE)
library(agGraphSearch)
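A quick check that the installation and load succeeded, using base R only (the expected version follows the 0.99.x numbering above):

# Sanity check: confirm the package is installed and attached
packageVersion("agGraphSearch")       # should report a 0.99.x version
length(ls("package:agGraphSearch"))   # number of exported objects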

Tutorial/workflow

Some examples of function execution in the package:

  • CkeckQuery_agCount_Label_Num_Wikidata_P279_P31

  • agCount_Label_Num_Wikidata_P279_P31 & agTableDT

  • CkeckQuery_agWD_Alt_Wikidata

  • CkeckQuery_agQIDtoLabel_Wikidata

  • CkeckQuery_agCount_ID_Num_Wikidata_QID_P279_P31
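The function names encode the Wikidata properties they query, P279 ("subclass of") and P31 ("instance of"). Once the package is loaded, their help pages can be browsed in the usual way, for example:

# Browse the package help index and one function's help page
help(package = "agGraphSearch")
?agCount_Label_Num_Wikidata_P279_P31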

SPARQL endpoints

Optional setting at OECU

Run the following commands in the R console.

# Proxy setting at OECU
proxy_url <- "http://wwwproxy.osakac.ac.jp:8080"
Sys.setenv("http_proxy" = proxy_url)
Sys.setenv("https_proxy" = proxy_url)
Sys.setenv("ftp_proxy" = proxy_url)

# Test 1: check network access through the proxy
curlGetHeaders("http://www.google.com/")
# OK if the returned status is 200
install.packages("readr")
# OK if the package installs successfully.

# Test 2: install agGraphSearch and its dependencies
install.packages("BiocManager")
BiocManager::install("BiocStyle")
install.packages("devtools")
devtools::install_github("kumeS/agGraphSearch")
library(agGraphSearch)

# Test 3: clone the repository (optional)
#system("git clone https://github.com/kumeS/agGraphSearch.git")
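If the proxy should only apply to the current session, the variables can be inspected and cleared with base R; a minimal sketch:

# Check the current proxy variables and clear them for this session
Sys.getenv(c("http_proxy", "https_proxy", "ftp_proxy"))
Sys.unsetenv(c("http_proxy", "https_proxy", "ftp_proxy"))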

Author / maintainer

  • Satoshi Kume

License

Artistic License 2.0.

Citation

@inproceedings{10.1145/3502223.3502227,
author = {Kume, Satoshi and Kozaki, Kouji},
title = {Extracting Domain-Specific Concepts from Large-Scale Linked Open Data},
year = {2022},
isbn = {9781450395656},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3502223.3502227},
doi = {10.1145/3502223.3502227},
abstract = {We propose a methodology for extracting concepts for a target domain from large-scale linked open data (LOD) to support the construction of domain ontologies providing field-specific knowledge and definitions. The proposed method defines search entities by linking the LOD vocabulary with technical terms related to the target domain. The search entities are then used as a starting point for obtaining upper-level concepts in the LOD, and the occurrences of common upper-level entities and the chain-of-path relationships are examined to determine the range of conceptual connections in the target domain. A technical dictionary index and natural language processing are used to evaluate whether the extracted concepts cover the domain. As an example of extracting a class hierarchy from LOD, we used Wikidata to construct a domain ontology for polymer materials and physical properties. The proposed method can be applied to general datasets with class hierarchies, and it allows ontology developers to create an initial model of the domain ontology for their own purposes.},
booktitle = {The 10th International Joint Conference on Knowledge Graphs},
pages = {28–37},
numpages = {10},
keywords = {Linked open data, Ontology construction, Domain ontology, Wikidata, Graph analysis},
location = {Virtual Event, Thailand},
series = {IJCKG'21}
}

aggraphsearch's People

Contributors

  • kumeS

aggraphsearch's Issues

Error: object 'quartz' is not exported by 'namespace:grDevices'

While installing 'agGraphSearch', the error 'Error: object 'quartz' is not exported by 'namespace:grDevices'' occurred.
I think 'quartz' is available only on macOS, but I would like to install the package in a Windows 10 (64-bit) environment.
(screenshot: R_quartz_error)

Please let me know how to resolve the issue.

Thank you for your kindness.

Issues occurred in "Short tutorial: a workflow to use agGraphSearch for leukemia terms"

@kumeS san, I am working through the tutorial titled above.
In section 3.5.2, 'Individual network diagrams', the program shows the messages below.

Warning message in dir.create(outputDir):
"cannot create dir 'agVisNetwork_急性リンパ性白血病.wd:Q180664_221209_files', reason 'Invalid argument'"
Warning message in dir.create(target_dir):
"cannot create dir 'agVisNetwork_急性リンパ性白血病.wd:Q180664_221209_files\htmlwidgets-1.5.4', reason 'Invalid argument'"
Error in normalizePath(path.expand(path), winslash, mustWork): path[1]="agVisNetwork_急性リンパ性白血病.wd:Q180664_221209_files/htmlwidgets-1.5.4": ファイル名、ディレクトリ名、またはボリューム ラベルの構文が間違っています。
Traceback:

  1. agVisNetwork(Graph = res5[[n]], Selected = Lab00, Browse = FALSE,
    . Output = TRUE, FilePath = FileName)
  2. VIS %>% networkD3::saveNetwork(file = FilePath)
  3. networkD3::saveNetwork(., file = FilePath)
  4. htmlwidgets::saveWidget(network, file, selfcontained)
  5. pandoc_save_markdown(html, file = file, libdir = libdir, background = background,
    . title = title)
  6. lapply(rendered$dependencies, function(dep) {
    . dep <- htmltools::copyDependencyToDir(dep, libdir, FALSE)
    . dep <- htmltools::makeDependencyRelative(dep, dir, FALSE)
    . dep
    . })
  7. FUN(X[[i]], ...)
  8. htmltools::copyDependencyToDir(dep, libdir, FALSE)
  9. normalizePath(target_dir, "/", TRUE)

The Japanese error text corresponds to the Windows message "The filename, directory name, or volume label syntax is incorrect.", and I think the cause is that ":" cannot be used in directory or file names on Windows.
(On macOS the program runs to completion, but another [WARNING] message appears: "Deprecated: --self-contained. use --embed-resources --standalone".)
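As a possible workaround (I have not tested it against the package internals), replacing the colon in the file name before calling agVisNetwork might avoid the error:

# Hypothetical workaround: replace ':' (invalid in Windows paths) in the
# file name from the tutorial before passing it to agVisNetwork
FileName <- gsub(":", "_", FileName, fixed = TRUE)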

Could you please fix these problems?

Thank you for your kindness.

Can I get datasets from Google Drive?

I am trying to learn how to use the package by following the tutorial ([Wikidata] agGraphSearch tutorial: A workflow to use agGraphSearch and Wikidata with PolyInfo terms).
However, I cannot access the datasets from the Google Drive that you own.
If you are willing to share them, could you please make the datasets available in the drive?

Thank you for your kindness.
