Did you mean:
- huge data modeling
- The class is based on the XPath data model.
- A data model is a plan for building a database.
- Data mining is the process of inferring knowledge from such huge data.
- This data model is described in an XML DTD.
- This is leading much computing to migrate back into huge data centres.
- Its data model is the basis of the data warehouse (DW) of an integrated supply chain (ISC).
- Creates a new property for the built-in data model.
- In data modeling and analysis, finite mixture models are widely used.
- Researchers realized seven years ago that the existing network could not process the huge data generated by the LHC, so an ultra-high-speed network had to be built.
- The Order could be the root of another hierarchy in the data model.
- The economists looked at a sample of mortgages in a huge data set that covers 60% of America's residential-mortgage market.
- The Catabase data model structure is shown in Figure 4.
- To handle the huge volume of data and the high-speed access it requires, SDRAM is used to store the data and an FPGA controls the SDRAM to complete the matrix transpose.
- This SQL script will create the data model used in our example.
- Because genomics involves huge amounts of data, bioinformatics plays an important role in data production, data management, and data mining.
- Integration takes place in the user interface and the data model.
- This logical data model is called a data source view.
- With the popularization of the internet and the rapid increase of information, it is increasingly difficult to extract useful knowledge from huge data sets.