ResearchOpenHouseSep2017

From NJIT-ARCS HPC Wiki


Overview of Shared and Dedicated Central Computational Resources for Researchers

Tartan Initiative

Provide a high-performance computing (HPC) & big data (BD) infrastructure so that

  • Junior faculty can establish a research record that enables them to secure funding
  • Established researchers can use funding to purchase dedicated HPC resources


Researcher Services

Researchers using resources in the Tartan central shared infrastructure are allocated certain base resources.

  • Dedicated resources that are needed by researchers beyond the base allocations must be acquired by the researcher.
  • Researchers must also acquire specialized resources that they need but which are lacking in the Tartan infrastructure.
  • Resources acquired by researchers are dedicated to those researchers.


Computational hardware in the Kong HPC cluster

Examples: Purchase of dedicated computational resources

  • Chemical, Biological and Pharmaceutical Engineering researcher
    • Kong CPUs, InfiniBand node interconnect, rack, power distribution unit (PDU). Funded by startup package
  • Computer Science, Bioinformatics / Data Science program
    • Kong CPUs and GPUs, rack, node interconnect switch, PDU, 21 TB disk. Funded by university funds

Disk and backup

Examples: Purchase of dedicated disk resources

  • Civil & Environmental Engineering, Transportation
    • 9 TB disk + backup for development Oracle server. Funded by external grant
  • Mechanical and Industrial Engineering researcher
    • 6 TB disk + backup for HPC computations. Funded by startup package
  • Biomedical Engineering researcher
    • 6 TB disk + backup for HPC computations. Funded by external grant

Disk and backup cost schedule

Computational hardware in the GITC 4320 datacenter

There are cases where, due to the nature of the research, hardware cannot be part of an HPC cluster, either Kong (general-use) or Stheno (DMS).

To accommodate such cases, IST provides space in the GITC 4320 datacenter.

The current and near-term occupants of this space are:

  • Computer Science: 9 units of various configurations
  • Physics: 1 server, 50 TB disk
  • Center for Natural Resources Development and Protection (CNRDP): 1 server

HPC Software

The user's environment for running software is set up using the "module" command; see UserEnvironment#Modules. Most of the available software has a corresponding module file, listed in this table.
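
For example, a typical interactive session on one of the clusters might look like the following minimal sketch; the module name used here is illustrative, and the actual names are those listed in the table referenced above:

    $ module avail          # list the module files available on the cluster
    $ module load gcc       # set up the environment for an illustrative "gcc" module
    $ module list           # show the modules currently loaded
    $ module unload gcc     # undo the environment changes made by "module load gcc"
    $ module purge          # unload all currently loaded modules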

Much of the software listed is stored in the AFS distributed file system, which means that it is available on Linux AFS clients in addition to the HPC clusters.

Requests for software installations should be sent to arcs@njit.edu.

Big Data (BD)

The shared hardware resource Horton.njit.edu supports both Hadoop and Spark environments for working with BD. See Hadoop_Overview.
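
A minimal sketch of the kind of workflow this supports is shown below; the file names and HDFS paths are hypothetical, the sketch assumes Spark applications run under YARN (typical for Hadoop distributions), and the actual procedure for Horton is described in Hadoop_Overview:

    $ hdfs dfs -mkdir -p /user/$USER/input        # create a directory in the Hadoop file system (HDFS)
    $ hdfs dfs -put data.csv /user/$USER/input    # copy a local file (hypothetical name) into HDFS
    $ hdfs dfs -ls /user/$USER/input              # confirm the file is in place
    $ spark-submit --master yarn analysis.py      # run a hypothetical Spark application on the cluster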

HTCondor

HTCondor is workload management software that uses spare cycles on Linux computers to perform computationally intensive work. See HTCondor.
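
A minimal sketch of submitting a job is shown below; the program and file names are hypothetical, and site-specific instructions are on the HTCondor page:

    $ cat job.sub
    # minimal HTCondor submit description file; program and file names are hypothetical
    executable   = my_program
    arguments    = input.dat
    output       = job.out
    error        = job.err
    log          = job.log
    request_cpus = 1
    queue

    $ condor_submit job.sub    # send the job to the HTCondor scheduler
    $ condor_q                 # check the status of jobs in the queue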

Consultation

Consultation on HPC and BD matters is available by contacting arcs@njit.edu.

Purchasing dedicated resources: off-premise

There are instances in which the best option for a researcher is to employ off-premise resources.

Cost schedules vary for such resources.