LB management replies
Question Description
Reply to all parts with 150 words apiece.
Part 1
The fundamental assets of a national scientific research laboratory align with one of four major areas. Professional staff, or the people component, includes the knowledge workers who hold the domain expertise and technical skills to perform a specific function or branch of research. The facilities, or space component, comprise the physical laboratory capable of hosting specific research studies. The scientific equipment includes specialized technology, tools, and devices not readily available elsewhere. Lastly, projects comprise scientific research, studies, and deliverables that demonstrate current usage, need, and success in utilizing the other three assets.
While this domain might appear trivial, measuring and understanding these assets to determine a laboratory's integrated capability to perform scientific research is difficult, especially in near real time. Like any commodity, the fundamental assets of a laboratory change regularly, and accurately measuring their availability now and into the future is an analytical challenge.
National research laboratories exist to address emerging challenges in science at both the national and global levels. As these needs constantly shift and increase in complexity, the specific capabilities of the laboratories must conform accordingly. Aligning a laboratory's capabilities to emerging challenges requires understanding precisely what assets are available and how they can be integrated to address current and future needs.
A study designed to evaluate and predict the likelihood that a laboratory has the necessary assets to address a specific research problem is a first step in determining the feasibility of integrated capability management. Upon success, the next incremental step would be developing a consistent model for evaluation across other domains and emerging scientific challenges. The intended outcome not only maximizes usage of laboratory assets but also analyzes disparate datasets to uncover capabilities that are not obvious or trivial. Developing a formal Integrated Capability Management (ICM) model would provide decision-makers with a data-driven approach to aligning capabilities with emerging challenges and identifying which asset sets would benefit most from further investment.
The purpose of this practical research effort is to gain a better understanding of current capabilities and to predict how well those capabilities align with emerging challenges across scientific domains. The research process focuses on known variables and guidelines and is not bound to any specific context. The underlying datasets are numerical and of significant volume, aligning well with the strengths of machine learning and artificial intelligence algorithms. Data-driven insights are communicated through visualization tools that enable ad hoc aggregation and online analytical processing. These traits align with the characteristics of a quantitative research approach (Leedy & Ormrod, 2016).
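To make the quantitative framing concrete, the following minimal sketch in Python shows how a simple classifier might estimate whether a laboratory's assets can address a given research problem. The asset features, labels, and weights are hypothetical assumptions for illustration only, not actual laboratory data or the proposed ICM model.

```python
# Illustrative sketch only: hypothetical (laboratory, problem) asset data,
# not actual laboratory records or the ICM model itself.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical features per (laboratory, problem) pair: staff with relevant
# expertise, specialized instruments, lab space, and prior related projects.
X = rng.integers(low=0, high=50, size=(200, 4)).astype(float)

# Hypothetical label: 1 if the laboratory successfully addressed the problem.
# The label is simulated from the features purely for demonstration.
y = (X @ np.array([0.4, 0.3, 0.1, 0.6]) + rng.normal(0, 5, 200) > 30).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```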
References
Leedy, P. D., & Ormrod, J. E. (2016). Practical Research: Planning and Design. Boston, MA: Pearson Education, Inc.
Part 2
When conducting research in the field of cybersecurity, it is important to focus on a specific topic and determine a plan for how the research will be conducted. Leedy and Ormrod (2016) demonstrate how to determine the best practical application for the research being conducted. Either qualitative or quantitative research methods could be used when conducting research within the cybersecurity field; however, the best approach for my research will be a quantitative study to gather technical data related to non-medical implantable technology and how it interacts with the human body. Gathering the required data can be achieved by performing action-based research (Leedy & Ormrod, 2016).
A quantitative study would be best for non-medical implantable research given that the different aspects of implantable technology can be measured. As an example, the measurements collected in this research would include the depth under the skin at which an implant can be placed and the orientation of the implant relative to the surface of the skin. I will be looking at how these two factors, location and orientation, affect the security aspects of an implanted device. Using an action-based research approach will allow me to examine how different types of wireless signals travel through the human body and how different implant locations might influence those signals. The study could be carried out using an analog human substitute made from medical-grade gel, which simulates human tissue.
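As a rough sketch of the kind of quantitative measurement involved, the Python snippet below models received signal strength as a function of implant depth using a simple linear-in-dB attenuation model. The transmit power and attenuation coefficient are illustrative assumptions, not measured properties of human tissue or the gel analog.

```python
# Illustrative sketch: linear-in-dB attenuation model for received signal
# strength versus implant depth. All constants are assumptions for
# demonstration, not measured tissue or gel properties.
TX_POWER_DBM = 0.0           # assumed transmit power of the implant (dBm)
ATTENUATION_DB_PER_MM = 1.2  # assumed tissue loss per millimetre of depth

def received_power_dbm(depth_mm: float) -> float:
    """Estimate received power at the skin surface for a given implant depth."""
    return TX_POWER_DBM - ATTENUATION_DB_PER_MM * depth_mm

for depth in (2, 5, 10, 20):  # candidate implant depths in millimetres
    print(f"depth {depth:2d} mm -> approx. {received_power_dbm(depth):6.1f} dBm")
```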
The work I have already done on my conceptual framework has shown that there is room for both quantitative and qualitative approaches, but I believe it is best to focus my research on the quantitative side. I also base the decision to use a quantitative study on feedback that our research topic should be narrow in focus. For the sake of my dissertation, I will focus only on measurements I can reliably retrieve from a quantitative study.
References
Leedy, P. D., & Ormrod, J. E. (2016). Practical Research: Planning and Design (11th ed.). Boston, MA: Pearson Education, Inc.
Part 3
The United States Department of Energy (DOE) collection of national laboratories consists of seventeen independent research facilities, all with a common objective of advancing security and prosperity at both the national and global levels. Through extensive research and technology advancement, the laboratories address problems across sectors such as energy, environmental management, molecular science, and national security that sit at the core of the DOE mission. The mission areas consist of seventeen core capabilities, which uniquely identify sectors for research, discovery, and innovation. A laboratory must have the appropriate professional staff, equipment, laboratory space, and experience to address its core capabilities.
The national laboratory system today cannot rapidly determine whether the availability of resources and the depth of experience and knowledge within a laboratory are enough to address a core capability. Decisions about accepting a challenging problem rely on the tribal knowledge of leadership rather than data-driven insights based on previous efforts. The lack of data-driven insights leads to inefficient alignment with specific projects, delays caused by shortages of professional staff, equipment, or lab space, and setbacks due to a lack of experience in the given domain. Without fact-based awareness, unidentified gaps in the workforce, poorly prioritized investments in equipment and laboratory space, and failure to apply expertise all contribute to opportunity loss.
The purpose of researching capability management at national laboratories is to determine whether a data-driven integrated capability model can drastically improve the impact and notable outcomes of work performed within the laboratories. While significant literature exists on capability management across various corporate domains, the problem within the national laboratory system is unique, given the multidisciplinary research areas and their needs. Combining different capabilities based on an advanced model utilizing real-time data is a Big Data Analytics challenge, which invites a review of existing literature in capability management and knowledge management, a quantitative research method, and, ultimately, a tested model with the ability to scale.
An essential key takeaway from this assignment is the need to establish subproblems from the original problem scope (Leedy & Ormrod, 2016). Each subproblem must be researchable independently, with the solutions aggregating toward the problem objective (Leedy & Ormrod, 2016). While autonomously researchable, a subproblem must contribute to the entirety of the main problem and not extend beyond it (Leedy & Ormrod, 2016). Lastly, subproblems must form a small finite set, usually two to six, where a more substantial domain forces re-evaluation of the original problem size (Leedy & Ormrod, 2016).
References
Leedy, P. D., & Ormrod, J. E. (2016). Practical Research: Planning and Design. Boston, MA: Pearson Education, Inc.
Part 4
With the escalation of cyber incidents and threats, the deficiency in the number of available, adequately trained cybersecurity professionals makes it challenging for organizations to defend against and mitigate cyber-attacks (Chadha, 2018). A 2017 study conducted by the Center for Cyber Safety and Education estimated over 300,000 currently unfilled cybersecurity positions in the United States alone and a projected deficiency of 1.8 million cybersecurity workers worldwide by 2022 (Chadha, 2018).
At the onset of my doctoral journey, the identified problem in my proposed study centered on the cause of the emerging gap in cybersecurity and on determining the required skillsets of current and prospective cyber professionals to reduce the workforce shortage. In refining the problem, and in accordance with Feldman's (2004) approach to research, the scope of my initial problem proved too broad in nature; therefore, the study shifted to a more refined contextual environment. The identified problem evolved into a proposed study on the strategies academic institutions and industries need to employ to improve the skills of current and prospective cybersecurity professionals in the Colorado Springs federal sector.
The purpose of the study is to qualitatively explore the strategies academic institutions and federal organizations need to employ collaboratively to improve the current skills of cybersecurity professionals and reduce the emerging gap in the Colorado Springs federal labor market. The heart of the identified problem must be decomposed into sub-problems that contribute to the nature of the overall problem (Leedy & Ormrod, 2016).
In developing the conceptual framework, an investigation of current cybersecurity training programs needs to be conducted to assess their alignment with current industry workforce needs. Additionally, industry needs to collaborate with academic institutions to socialize the growing and evolving knowledge, skills, and abilities demanded of the current and future cyber workforce. While the immediate impact on cybersecurity workforce shortages resides with post-secondary institutions, introducing science, technology, engineering, and mathematics programs at the elementary level may have a profound impact on the long-term future of cyber professionals.
References
Chadha, J. (2018). Three Ideas for Solving the Cybersecurity Skills Gap; One possibility: Create a Cybersecurity Peace Corps. Wall Street Journal (Online). Retrieved from https://proxy.cecybrary.com/login?url=https://search.proquest.com/docview/2108677666?accountid=144789
Feldman, D. C. (2004). What are we talking about when we talk about theory? Journal of Management, 30 (5), 565–567. Doctoral Library-SAGE.
Leedy, P. D., & Ormrod, J. E. (2016). Practical Research: Planning and Design. Boston, MA: Pearson Education, Inc.
Part 5
Theory in research plays an important role in helping to ensure that the outcome is linked to the fundamental research area (Feldman, 2004). Using theory is a way to guide the research and to help answer a generalized statement that links two or more facts (Sunday, 2015). It is important not to confuse a hypothesis, which is used to predict how two or more items could interact, with a theory, which is a broader concept (Sunday, 2015). When writing a research paper, it is important to fully understand what has already been researched in your area and how your research will work to address or test a theory (Feldman, 2004). Feldman (2004) states that theory can be perceived differently by different scholars, and therefore any research should be written clearly to show mastery of the field being researched.
The research topic of non-medical implantable technology has not been explored in depth. Therefore, I will be trying to answer the question of whether an implantable chip can be used to replace card-based technology, such as a credit card or driver's license. Since this field has not been explored, my research will be based on the adoption of the different types of wearable technology we have already accepted into our daily lives (Liberati, 2018), combined with research on medical implants and how wireless transmissions interact with the human body (Moradi, Sydänheimo, Bova, & Ukkonen, 2017).
The fundamental areas have already been researched; however, I have been unable to find research examining whether an implantable chip could safely and securely replace our existing card-based systems. Even within this narrow focus, there are still multiple areas that could be explored. I will be focusing my research on the security aspects of the data that would be stored on the chip and how it could be used or compromised.
References
Feldman, D. C. (2004). What are we talking about when we talk about theory? Journal of Management, 30(5), 565-567. doi:10.1016/j.jm.2004.05.001
Liberati, N. (2018). The Borg–eye and the We–I. The production of a collective living body through wearable computers. AI & Society, 1-11. doi:10.1007/s00146-018-0840-x
Moradi, E., Sydänheimo, L., Bova, G. S., & Ukkonen, L. (2017). Measurement of wireless power transfer to deep-tissue ID-based implants using wireless repeater node. IEEE Antennas and Wireless Propagation Letters, 16, 2171-2174. doi:10.1109/LAWP.2017.2702757
Sunday, C. E. (2015). The role of theory in research. 1-21. Retrieved from https://www.uwc.ac.za/Students/Postgraduate/Docume… role of theory in research.pdf
Part 6
Research projects can take a theoretical approach and enhance the academic conceptualization of a topic, while applied research provides immediate impact and relevance to current practices and procedures (Leedy & Ormrod, 2015). Theory in the research process seeks to outline a specific variable's influence in a defined contextual environment (Feldman, 2004). Feldman (2004) identified the daunting task of clearly and accurately defining hypotheses through mastery of previous research as a contributing factor in developing theory in research. Ultimately, the acceptance of an established theory in research may differ from one researcher to another (Feldman, 2004).
As stated by Feldman (2004), the lack of previous research does not by itself warrant or justify embarking upon the research in question. From the perspective of the perceived shortage of current cybersecurity professionals and an emerging gap in the skillsets of today's cyber workforce, theoretical consideration of specific factors impacting a small subset of the cyber workforce may prove beneficial to the overall problem. As a cyber professional in the federal Colorado Springs labor market, I believe a study of regional and environmental constraints impacting the local area may provide a theoretical contribution to the research. As with many government positions, the availability of a "cleared" workforce coupled with the required skillsets for cyber-related professions may be a limiting factor for entry into the cyber workforce; meanwhile, candidates who have the requisite skills but are not eligible for a government clearance further constrain the readily available workforce.
From the perspective of the required skills, the efficacy of available training programs and academic resources may be a contributing factor to the regional deficiency in a readily available cyber workforce. As an instructor in the educational sector, I recognize that a relevant and exhaustive program designed to train and prepare the next generation of cyber professionals requires significant consideration of its alignment with the professional needs of the federal Colorado Springs cyber workforce.
Lastly, another potential theoretical research area for consideration centers on the introduction and socialization of career opportunities in cyber. While the prospects of a cybersecurity and information technology career often do not become apparent until the high school or collegiate level, the introduction and use of information technology is found as early as the elementary school level. An introduction of foundational cybersecurity skills at the elementary level through exposure to science, technology, engineering, and mathematics (STEM) programs may increase interest in STEM-related careers, including, but not limited to, cybersecurity.
References
Feldman, D. C. (2004). What are we talking about when we talk about theory? Journal of Management, 30(5), 565–567. Doctoral Library-SAGE. Retrieved from http://jom.sagepub.com.proxy.cecybrary.com/content/30/5/565.full.pdf+html
Leedy, P. D., & Ormrod, J. E. (2015). Practical Research: Planning and Design (11th ed.). Upper Saddle River, NJ: Pearson.
Part 7
In the past, researchers have tried to build a connection between communities that exist online and those that exist in the real world, making the assumption that the online community is based on physical location, similar to a real-world community (Seungahn, 2010). Seungahn (2010) discusses a theoretical framework designed to help better understand how virtual and real communities are connected. This theoretical framework explores the idea of virtual commons, which can be thought of as something the entire community utilizes that has a level of value (Seungahn, 2010). The commons are also owned equally by each member of the virtual community, similar to a common area in an apartment or condominium complex.
To help ensure the virtual commons would function the same as a physical public area, the Internet would need to be independent and free from the influence of marketing. The Internet should also allow people to engage in discussions with members of the government and encourage conversations in order to help foster change and collaborative action. Unfortunately, the Internet is far from perfect in offering virtual commons that can be used in the same way as physical commons. The Internet does offer different types of technology that can be used for communication, such as bulletin boards, online chatrooms, and newsgroups, which help foster communication between members of the virtual community (Seungahn, 2010).
A similar approach in developing a theoretical framework on how end users have accepted medical implantable and wearable devices could be used to show how, in the near future, they may begin to accept non-medical implantable technology. Using such a framework could aid in reducing the scope of my research to only address specific security concerns. My research could focus on the security concerns that cause users to be apprehensive about non-medical technology, such as data protection and privacy.
References
Seungahn, N. A. H. (2010). A theoretical and analytical framework toward networked communities. Javnost - The Public, 17(1), 23-36. doi:10.1080/13183222.2010.1100902
Part 8
The article "Diffusion of Innovations and the Theory of Planned Behavior in Information Systems Research: A Metaanalysis" by Weigel, Hazen, Cegielski, and Hall is the subject of this post. The theoretical premise of the article is the hypothesis that the field of information systems has matured beyond the findings of Tornatzky and Klein's 1982 meta-analysis of innovation characteristics. The authors combine two historically disparate models, Diffusion of Innovations and the Theory of Planned Behavior, to create a new blended model, innovation adoption-behavior (IAB). The authors' approach includes a meta-analytic examination of the literature using Tornatzky and Klein's outcomes as a starting point, a description of the quantitative method, results analysis and findings, and next steps toward further research (Weigel, Hazen, Cegielski, & Hall, 2014).
The Diffusion of Innovations theory investigates the rate at which new ideas in technology propagate through a social system and the point at which the innovation becomes self-sustaining. The key elements in diffusion research include innovation, adopters, communication channels, time, and a social system (Rogers, 1983). The Theory of Planned Behavior is a psychological concept referring to the relationship between an individual's opinions and behavior. The theory aims to improve the predictive ability of the theory of reasoned action by incorporating the perception of behavioral control (Ajzen, 1991).
The authors performed a meta-analysis of fifty-eight carefully selected articles, utilizing proven methods designed for such research. The categories were weighted based on past methods and qualitatively executed to produce outcomes classified in parallel to Tornatzky and Klein's method (Weigel et al., 2014). Tornatzky and Klein defined three research needs to which the authors directly responded: more and better research, the inclusion of additional independent variables, and the reduction of innovation attributes to only a critical few. The authors satisfied the first need directly and met the requirements of the second and third by applying the IAB (Weigel et al., 2014).
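As a hedged illustration of the weighting idea only (not the authors' actual data or procedure), a bare-bones, sample-size-weighted mean effect across hypothetical studies can be computed as follows in Python:

```python
# Illustrative only: sample-size-weighted mean correlation across hypothetical
# studies (a bare-bones meta-analytic aggregate), not the data or weighting
# scheme actually used by Weigel et al. (2014).
studies = [  # (observed correlation r, sample size n) -- hypothetical values
    (0.31, 120),
    (0.18, 85),
    (0.42, 200),
    (0.25, 60),
]

total_n = sum(n for _, n in studies)
weighted_mean_r = sum(r * n for r, n in studies) / total_n
print(f"sample-size-weighted mean r = {weighted_mean_r:.3f}")
```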
The key learning from this activity is the way the authors leveraged two well-substantiated theories into a new, innovative model to extend the work of Tornatzky and Klein. This approach is intriguing and has the potential to apply to the doctoral research problem area. Doctoral candidates can practice combining previously proven theoretical work into a new solution or model and exercising it through quantitative or qualitative research methods. The integration of proven theoretical models contributes significantly to the production of cumulative knowledge (Weigel et al., 2014).
References
Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179-211. doi:10.1016/0749-5978(91)90020-T
Rogers, E. (1983). Diffusion of Innovations (3rd ed.). New York: Free Press of Glencoe.
Weigel, F. K., Hazen, B. T., Cegielski, C. G., & Hall, D. J. (2014). Diffusion of Innovations and the Theory of Planned Behavior in Information Systems Research: A Metaanalysis. Communications of the Association for Information Systems, 34(31). doi:10.17705/1CAIS.03431
Part 9
Richard Bellman, the pioneer of dynamic programming, developed the curse of dimensionality theory while working in the context of data optimization (Bellman, 1957; Lee & Yoon, 2017). This theory applies to settings with high-dimensional datasets in which the number of column attributes disproportionately outnumbers the number of observed rows (Lee & Yoon, 2017; Mirza et al., 2019). A dataset that contains DNA, RNA, and other gene expression measurements for a small sample of cancer patients is one example (Mirza et al., 2019). The potential issue with this type of dataset structure, as the curse of dimensionality posits, is that statistical challenges increase exponentially (not linearly) as a function of the total number of variables considered in the analysis (Rosenstrom et al., 2018). This often affects the integrity of the applied statistical technique and has serious analytical consequences such as model complexity and overfitting, multicollinearity, and data sparsity (Lee & Yoon, 2017; Mirza et al., 2019). Acknowledging the curse of dimensionality, researchers working with high-dimensional datasets try to overcome these challenges through proposed solutions such as data reduction, data projection, or modifications to traditional statistical methods (Lee & Yoon, 2017; Mirza et al., 2019).
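A small Python sketch on purely synthetic random data makes one symptom of this phenomenon concrete: as the number of attributes grows, the contrast between a point's nearest and farthest neighbors collapses, which is one of the statistical challenges the curse of dimensionality describes.

```python
# Illustrative only: synthetic data showing distance concentration,
# one symptom of the curse of dimensionality.
import numpy as np

rng = np.random.default_rng(0)
n_rows = 200                       # observations
for n_cols in (2, 10, 100, 1000):  # increasing number of column attributes
    X = rng.random((n_rows, n_cols))
    dists = np.linalg.norm(X[1:] - X[0], axis=1)  # distances to one reference row
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"{n_cols:5d} attributes -> relative distance contrast {contrast:.3f}")
```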
The relationship between certain personality disorders and the onset and/or recurrence of Alcohol Use Disorder was investigated via a prospective observational Norwegian twin study that ran from 1999 to 2011 (Rosenstrom et al., 2018). Because of the high number of collected behavioral traits, the data analysis was noted to suffer from complexities inherent to the curse of dimensionality (Rosenstrom et al., 2018). In this specific study, the theory drove the choice of analysis technique (Rosenstrom et al., 2018). Instead of the more traditional stepwise-regression model, the Elastic Net (EN) method was used (Rosenstrom et al., 2018). The EN method, developed to enhance classical regression techniques for high-dimensional datasets, works by automatically dropping unnecessary predictor variables from the model (Rosenstrom et al., 2018). The robustness of the EN-based results was tested against another type of variable-selection strategy (Rosenstrom et al., 2018). The similarity of the conclusions provided empirical support for the EN method as an option for handling high-dimensional challenges (Rosenstrom et al., 2018).
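To illustrate how an Elastic Net drops unnecessary predictors in a wide dataset, a minimal scikit-learn sketch on synthetic data (not the twin-study data or the exact analysis in Rosenstrom et al.) might look like this:

```python
# Illustrative only: Elastic Net on a synthetic "wide" dataset, showing how
# the method zeroes out unnecessary predictors; not the twin-study analysis.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNetCV

# Far more predictors than observations, with only a few informative ones.
X, y = make_regression(n_samples=80, n_features=500, n_informative=10,
                       noise=5.0, random_state=0)

# Cross-validated Elastic Net: the L1 term performs variable selection,
# the L2 term stabilizes estimates among correlated predictors.
model = ElasticNetCV(l1_ratio=0.5, cv=5, random_state=0).fit(X, y)
retained = int(np.count_nonzero(model.coef_))
print(f"predictors retained: {retained} of {X.shape[1]}")
```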
Because medical data often suffers from issues associated with the curse of dimensionality, my dissertation research involving clinical trial data will need to address this complexity (Mirza et al., 2019). The multiple comparison aspect of adaptive designs is in itself a problem of high dimensions, as every hypothesis adds a new dimension to the probability space. The computational demands needed to process these multiple comparisons are high and, as the theory explains, exponentially challenging to manage. Like the Rosenstrom et al. (2018) study, my own choice of analytic method will be influenced by the concepts behind this theory.
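As a hedged sketch of one standard way to keep the multiple comparison burden in check (a simple Bonferroni correction applied to hypothetical p-values, not a recommendation for any specific adaptive design):

```python
# Illustrative only: Bonferroni control of family-wise error across m
# hypotheses, using hypothetical p-values.
alpha = 0.05
p_values = [0.001, 0.012, 0.034, 0.20]   # hypothetical per-hypothesis p-values
m = len(p_values)

adjusted_alpha = alpha / m               # each test judged at a stricter threshold
decisions = [p <= adjusted_alpha for p in p_values]
print(f"per-test threshold: {adjusted_alpha:.4f}")
print(f"rejected hypotheses: {decisions}")
```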
References
Bellman, R. (1957). Dynamic Programming. Princeton, NJ: Princeton University Press.
Lee, C. H., & Yoon, H.-J. (2017). Medical big data: Promise and challenges. Kidney Research and Clinical Practice, 36(1), 3-11. doi: 10.23876/j.krcp.2017.36.1.3.
Mirza, B., Wang, W., Wang, J., Choi, H., Chung, N. C., & Ping, P. (2019). Machine learning and integrative analysis of biomedical big data. Genes, 10(2), 87. doi: 10.3390/genes10020087.
Rosenstrom, T., Torvik, F. A., Ystrom, E., Czajkowski, N. O., Gillespie, N. A., Aggen, S. H., … Reichborn-Kjennerud, T. (2018). Prediction of alcohol use disorder using personality disorder traits: A twin study. Addiction, 113(1), 15-24. doi: 10.1111/add.13951.
Part 10
Ikujiro Nonaka developed the theory of knowledge transformation and flow while working on the idea of organizational knowledge as a strategic business asset and competitive advantage (Kermally, 2005; Moroz & Szkutnik, 2016; Nonaka & Toyama, 2005). The knowledge flow theory emphasizes the importance of capturing and using what the people in an organization know by applying four modes of organizational knowledge conversion: Socialization, Externalization, Combination, and Internalization, or SECI (Kermally, 2005; Nonaka & Toyama, 2005). Applying each knowledge conversion mode to individual knowledge within an organization makes the knowledge more explicit and easier to share and leverage across an organization.
Nonaka’s theory applies to projects and project management activities within an organization. Project management research relies heavily upon theories from other related disciplines such as operations management, strategic management, and organizational leadership. Project knowledge management assimilates knowledge management principles into project management processes and knowledge areas across the project lifecycle (PMI, 2017). Continued project management theory application and development is an important component and contributor to the project management knowledge domain (Johnson, Creasy & Yang Fan, 2015; PMI, 2017).
Creating and applying project knowledge management principles in an organization requires process, technology, and organizational support. Abd (2017) uses five theoretical lenses to evaluate the Project Management Institute's practical project management methodology and framework, A Guide to the Project Management Body of Knowledge, or PMBOK Guide (PMI, 2017). The missing link between the practical contents of the PMBOK Guide and knowledge flow theory is examined in detail (Abd, 2017). The study is based on a qualitative, case-study approach to assess and understand the identified gap between project management knowledge flow practice and knowledge flow theory from the perspective of the working project or program manager. PMI needs to add a new project management knowledge area to its standard for project knowledge management to address the deficiencies between practice and theory in this area (Abd, 2017; Gasik, 2011; Oun, 2016; PMI, 2017).
The problem to be addressed in my study is that new and emerging technologies are not being leveraged by organizations to manage project knowledge (Pridy, 2016). The study will explore new and emerging technologies that project managers can and do apply to effectively manage project knowledge within a project and across all projects in the organization. One key takeaway from learning about Nonaka’s theory for knowledge flow and the resulting SECI model is the need for a specific knowledge flow model to be selected and applied as part of my study. Numerous knowledge flow models exist and they do not all go about defining the process of knowledge management from start to finish in the same way. A second key takeaway is recognizing the effectiveness of qualitative research methods based upon rich data collected from real-world projects and based upon what project managers have experienced when practicing the discipline. Since so much of project management began on the practitioner side of the equation, it makes sense that the new knowledge and theories will build on that practical experience to enrich the knowledge area.
References
Abd, E. H. (2017). Analyzing the project management body of knowledge (PMBOK) through theoretical lenses: A study to enhance the PMBOK through the project management theories. Available from ProQuest Dissertations & Theses Global. (1922647192).
Part 11
A purpose statement communicates the intent of the research study to the reader or reviewer, including why the study is being done in the first place and what the researcher hopes to accomplish with the completed work (Creswell & Creswell, 2018). The study objective or objectives must be clearly stated, just as a project manager must clearly state the objectives of a project and what capabilities will be realized through its outcome. A purpose statement or set of statements also aligns with the research method that is selected: qualitative, quantitative, or mixed methods (Creswell & Creswell, 2018).
The draft purpose statement for my research uses the template and examples provided by Creswell and Creswell (2018) as a framework: The purpose of this proposed qualitative study is to explore new and emerging technologies that project managers can apply to effectively manage project knowledge. At this stage in the research, project knowledge management is generally defined as the process and technologies required to generate, use and distribute project knowledge between individuals at all organizational levels to increase project team capabilities, improve project execu