Thursday, November 28, 2019

Limitation of Managerial Economics free essay sample

Economists are a paragon of virtue, rationality and common sense amidst a sea of ignorance, superstition and irrationality. They are probably right, but sometimes it is good to state a case in strong terms to make people think. However, I feel there is a need to redress the balance and point out the many mistakes and limitations of Economics, which are stated below.

Economics is difficult

John Maynard Keynes said economics is very difficult, and many people underestimate how difficult it is. In maths, 2+2 always equals 4, but in economics the answer usually depends on countless variables that are almost too difficult to take into account. To give one example, consider the link between the money supply and inflation. The quantity theory of money, MV=PY, suggests a link between money growth and inflation that depends on the velocity of circulation. (As most non-economists would tell you, if you print money you will cause inflation.) But in practice the growth of the money supply is influenced by so many variables, such as technological change, velocity of circulation and consumer behaviour, that M3 growth statistics became almost meaningless. MV=PY is great in theory, but in practice it is difficult to make anything of it.

Forecasting the Future

It is difficult to forecast the future, yet in economic policy making it is important. Go back to May 2007: how many economists were predicting a fall in UK house prices of 25% and the deepest recession since the war? I wasn't, and certainly not the Treasury economists, who were predicting stable growth of 2% and a reduction in the government's borrowing. Of course, there were people predicting a house price collapse, and they have been proved right.
(Though some of them started predicting a house price collapse back in 2000.)

Difficulty in knowing where you are

One of the great challenges is knowing the current state of the economy. For example, China's growth and unemployment figures are always viewed with suspicion. There is great debate about what the US inflation rate is; it depends on which model you use. Recently, the US GDP statistics were revised, meaning that the economy was in recession much earlier than previously thought. How can you make good policy when you don't even know what happened in the past, let alone predict the future?

Using old models

In a way, this recession is unusual in that it wasn't preceded by an inflationary boom. The government felt that as long as inflation was under control, the economy must be sustainable. However, the mistake was to ignore an asset and lending bubble. The problem is that it is not sufficient to rely on previous experiences. As the economy develops, old models become less relevant because they rest on outdated assumptions and data.

Ideology

A good economist would be free from ideology and willing to revise theory in light of empirical evidence that doesn't match up to their beliefs or expectations. However, in practice many dislike evidence which doesn't agree with their point of view. For example, some economists place great faith in the virtues of the free market and therefore take a lazy attitude in assuming free markets will always increase economic welfare. Free markets can often be beneficial, but there will always be exceptions; you can't make generalisations that free markets are always best, nor can you make generalisations that free markets are always wrong. Another example of ideological economics could be the assumption that if privatisation works in one country it must be good for other countries too. Free trade is another example. Most economists will tell you free trade is beneficial.
But this doesn't necessarily mean developing countries should always stick to the free trade mantra. There can be exceptions to every rule, for example the infant industry argument.
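Returning to the MV=PY point made earlier: a minimal Python sketch (with made-up growth rates, not real data) shows why the identity is "great in theory" but fragile in practice. In growth-rate form, inflation is roughly money growth plus velocity growth minus real output growth, so the implied inflation rate is only as reliable as the velocity estimate.

```python
# Growth-rate form of the quantity theory MV = PY:
# inflation ~ money growth + velocity growth - output growth.
def implied_inflation(money_growth, velocity_growth, output_growth):
    """All arguments are decimal growth rates, e.g. 0.05 for 5%."""
    return money_growth + velocity_growth - output_growth

# If velocity were stable (0%), 8% money growth with 3% real growth
# would imply roughly 5% inflation...
print(round(implied_inflation(0.08, 0.0, 0.03), 2))  # 0.05
# ...but a 4-point fall in velocity wipes most of that out:
print(round(implied_inflation(0.08, -0.04, 0.03), 2))  # 0.01
```

The second call is the essay's point in miniature: the same money growth is consistent with almost any inflation outcome once velocity moves.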

Sunday, November 24, 2019

How to learn to listen to music, not just hear it

How to learn to listen to music, not just hear it

To learn to listen to music, not just hear it, you need the right room, the right equipment, the perfect volume, the perfect spot, and (of course) the embracing of the music. After you have all the proper tools, you can sit and enjoy the music.

The first consideration is to listen to music in a comfortable chair. I would highly recommend a good quality La-Z-Boy recliner. Now, you need to find the best room to put that chair in, so you can listen to your music. The room can be any normal room with four walls, but it can't be wide open. For example, it can't be an unfinished basement with concrete walls and a cement floor. The sound will not be able to bounce off the walls and give the effect of the sound coming from behind you as well as in front of you (the surround effect). A good room to listen in is a typical family room with sheetrock walls and four ninety-degree corners.

The second consideration is the placement of speakers. The corners of a room are the perfect spot for your speakers. You shouldn't position them flush against the wall, but put the back of the speaker into the corner, so each side of the speaker is against each wall. For this reason, the bass is extended (louder), and the tweeters, mid-range, and woofers give you their undivided attention.

Where to sit is simple, but it takes some easy calculations to find the perfect spot. There is a common rule for a person to experience the full effect of the music. In order to do this, measure the distance between the two speaker cabinets. If the speaker cabinets are placed twelve feet apart from each other, divide twelve feet...
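The calculation above is cut off, but it appears to be heading toward a listening-position rule of thumb. One common version (an assumption here, not necessarily the exact rule the truncated passage goes on to state) is the equilateral-triangle rule: sit centred between the speakers, at a distance such that you and the two cabinets form an equilateral triangle. A quick sketch:

```python
import math

def listening_position(speaker_separation_ft):
    """Equilateral-triangle rule of thumb: sit as far from each speaker
    as the speakers are from each other, centred between them. Returns
    the depth from the line joining the speakers back to the chair."""
    # Height of an equilateral triangle with the speaker line as its base:
    depth = math.sqrt(speaker_separation_ft**2 - (speaker_separation_ft / 2) ** 2)
    return depth

print(round(listening_position(12), 1))  # ~10.4 ft back from the speaker line
```

For the essay's twelve-foot example, this puts the chair about ten and a half feet back from the line of the speakers.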

Thursday, November 21, 2019

Male and female consumer buying behaviours Essay

Male and female consumer buying behaviours - Essay Example

Two California companies, Jan Stuart and Inner-Face, also launched men's lines during the 1980s that flopped. "Skin treatment lines for men have come and gone, because the population just wasn't ready," said Pamela Baxter, VP-Marketing at Aramis. But the expanding number of over-35 men led Aramis execs to believe that there is now a legitimate men's market. Aramis recently launched Lift Off!, an alpha hydroxy acid (AHA) product for men. The overall men's cosmetic market is now too tiny to count, Baxter said, but Aramis estimates Lift Off! could eventually account for as much as 12% of sales. However, it is still undecided on an overall marketing strategy. "It's easier to sell a man if a woman is with him," said Baxter. Aramis has two radio commercials in test in Kansas City, Dallas, and Miami. One targets men, and the other is aimed at women (Campbell, 2000). Men are often introduced to skin lotions by their wives or girlfriends. When BeautiControl, a Dallas-based direct-sales cosmetics company, introduced an AHA product called Regeneration last year, "We had wives writing to us saying their husbands were using the product and loved it," said Ed Hookfeld, director of product marketing. The company packaged the same lotion in a gray and black box, shot a new ad campaign featuring CEO Dick Heath, and renamed the product Regeneration for Men. Wendy Liebmann, president of New York-based WSL Marketing, said there is potential for growth in men's cosmetics, but cautioned against expecting dramatic sales increases. "You're talking about educating somebody who washes their face with soap and shaves," she said (Campbell, 2000). Chad Schexnyder of New Orleans remembers how it used to be: approaching the counter of a local drugstore with a bottle of Noxzema in his hand, the cashier accosted him. "This is for women," she said with a dismissive air. "I've never seen a man buy Noxzema."

Wednesday, November 20, 2019

Definition of Disability In Relation To Using Website or Computers Dissertation

Definition of Disability In Relation To Using Website or Computers - Dissertation Example They may not be able to see, hear, move, or may not be able to process some types of information easily or at all. They may have difficulty reading or comprehending text. They may not have or be able to use a keyboard or mouse. They may have a text-only screen, a small screen, or a slow Internet connection. They may not speak or understand fluently the language in which the document is written. They may be in a situation where their eyes, ears, or hands are busy or interfered with (e.g., driving to work, working in a loud environment, etc.). They may have an early version of a browser, a different browser entirely, a voice browser, or a different operating system. Content developers must consider these different situations during page design.
While there are several situations to consider, each accessible design choice generally benefits several disability groups at once and the Web community as a whole. For example, by using style sheets to control font styles and eliminating the FONT element, HTML authors will have more control over their pages, make those pages more accessible to people with low vision, and by sharing the style sheets, will often shorten page download times for all users. Content developers should make content understandable and navigable. This includes not only making the language clear and simple but also providing understandable mechanisms for navigating within and between pages. Providing navigation tools and orientation information in pages will maximize accessibility and usability. Not all users can make use of visual clues such as image maps, proportional scroll bars, side-by-side frames, or graphics that guide sighted users of graphical desktop browsers. Users also lose contextual information when they can only view a portion of a page, either because they are accessing the page one word at a time, or one section at a time (small display, or a magnified display). Without orientation information, users may not be able to understand very large tables, lists, menus, etc. An accessible Internet (web) site is one that has been designed and built in such a way that anybody can get into it and make use of all its facilities, even if they live with a disability.

Sunday, November 17, 2019

My dream computer Research Paper Example | Topics and Well Written Essays - 750 words

My dream computer - Research Paper Example

Thinking of a dream computer is a delightful experience. I have been using computers for a long time and have worked on several types, ranging from old Pentium-III machines to present-day state-of-the-art computers. One thing common to all these experiences is the wish for an even faster computer. But now it is not only a wish and a dream; it is a genuine requirement for several different tasks that cannot be accomplished without such a machine, or would at least consume time that cannot be afforded. Therefore, it is now extremely important to work out the details of a computer that can meet the day-to-day requirements of the various tasks discussed ahead.

Being a student, I have to complete college assignments that may require word processing, presentation, picture editing, drawing, worksheet and equation-writing facilities, along with simple website development software. Besides this, for writing technical reports I use some CASE tools. Math assignments are required to be written in specialized software. Entertainment support is inevitable, and special sound and graphics equipment is needed to fulfill the requirements of the latest games and videos. Internet support software and hardware also form part of the requirements for a dream computer.

Software Requirements

Besides Microsoft Office 2007, I prefer to use the MathType software, which is a very convenient way to prepare math assignments. MathType can insert complex equations and provides a wide range of mathematical symbols. It is easy to use and provides an excellent editing environment. These math equations cannot be written well in Microsoft Office because it is not specialized software for math equations. Adobe Creative Suite 5.5 Web Premium is an excellent student suite which is available from Adobe at a special 80% discount. It is an ultimate solution for all website and photo-editing requirements.
The software is a bit complex and requires special training and tutorials to be followed for extended use and meaningful contribution. This software requires an extra bit of processing power and system resources, but it pays off and provides good value for the money. CloneDVD is my favorite DVD player. It is excellent software which can play nearly all types of media files. It also provides recording facilities and can run in the background. The software is not resource-hungry and can be used while other heavy applications are running on the computer. The online help and support can assist in understanding the software, while routine updates can be downloaded when available.

Software: MathType
Operating system: Microsoft Windows 7, Windows Vista, or Windows XP
System requirements: 12 MB free hard disk space; MathType is not RAM-intensive, so listing its requirements is not necessary
Price: $78.02

Software: Adobe Creative Suite 5.5 Web Premium
Operating system: Microsoft Windows XP with Service Pack 3; Windows Vista Home Premium, Business, Ultimate, or Enterprise with Service Pack 1 (Service Pack 2 recommended); or Windows 7
System requirements: Intel Pentium 4 or AMD Athlon 64 processor; 1 GB of RAM or more; 1280x800 display with qualified hardware-accelerated OpenGL graphics card, 16-bit color, and 256 MB of VRAM recommended
Price: $449

Software: CloneDVD
Operating system: Microsoft Windows 98/98SE/ME, Windows NT/2000/2003, Windows XP, Windows Vista, Windows 7
System requirements: Intel CPU with 350 MHz and MMX, or AMD CPU with 450 MHz or higher; 64 MB RAM or more
Price: $69

Computers Researched

I have researched three major brands in the market. All of these laptops can fulfill the hardware requirements and support the software discussed in Part II of this report. The price differences are based on features and may also change slightly if extra features or capabilities are appended to the hardware details provided in the Excel sheet.
I have finalized the eMachines laptop, which fulfills all the requirements and will meet advanced requirements for one year. The machine offers some good upgrade options and can be considered for an upgrade later on. The price of the computer is neither too high nor too low and falls under

Friday, November 15, 2019

Thalamic Glutamate as a Marker of Global Brain Pathology -MS

Thalamic Glutamate as a Marker of Global Brain Pathology - MS

Author contributions: LP: design and conceptualisation of the study, analysis and interpretation of data, drafting the manuscript for intellectual content. JR: design and conceptualisation of the study, data collection, analysis and interpretation of data, drafting the manuscript for intellectual content. IRB: analysis and interpretation of data, revising the manuscript for intellectual content. GS: analysis and interpretation of data. KZ: data collection. RN: design and conceptualisation of the study, analysis and interpretation of data, drafting the manuscript for intellectual content.[LP1]

Disclosures: LP: no disclosures. JR: no disclosures. IRV: no disclosures. GS: no disclosures. KZ: no disclosures. RN: Bayer, Biogen, Genzyme, Merck Serono, Roche (honoraria for speaking, advisory boards); Biogen, Genzyme, Novartis (funds for organising education, staff); Biogen, Novartis (principal investigator).[LP2] [LP3]

Multiple sclerosis

Multiple sclerosis (MS) is characterised by demyelination and variable degrees of axonal loss and gliosis. People with MS (pwMS) present with sensory disturbances, spasticity, fatigue, ataxia, pain and urinary dysfunction1. The most common form of MS is relapsing-remitting, with 85% of pwMS initially presenting with it and most eventually progressing to a secondary, progressive phase2. Without adequate treatment, 25% of pwMS become wheelchair-bound3. Charcot was the first to describe the inflammatory demyelinating plaque as a hallmark of MS in the late 19th century4. While white matter lesions (WML) contribute to disability5,6, they are likely not its only driver. Recent evidence supports the concept that grey matter lesions (GML) and atrophy are likely contributors to disability7,8. Furthermore, recent studies have looked at diffuse axonal loss and support the notion that this process drives long-term disability, due to a combination of focal inflammation and cortical damage driven by meningeal inflammation9-13.
Large clinical trials in MS infrequently correlate the effect of therapies with brain lesion volumes and atrophy. This is because, as of today, no automated software exists that can consistently calculate WMLs14, and GMLs are grossly underestimated as they are not readily visible on MRI15,16. Lastly, brain atrophy is hard to quantify, can only be measured longitudinally, and is subject to non-tissue-related (pseudo-atrophy) volume loss following disease-modifying treatment17,18. There is an unmet need for a simple biomarker that can act as a surrogate for neuronal damage in MS for use in observational and interventional studies.

Natalizumab

Natalizumab (Tysabri) is a disease-modifying treatment given intravenously as a monthly infusion19. In the UK it is licensed as a second-line treatment for severe, rapidly evolving, relapsing-remitting MS. It is directed against the α4 subunit of integrin on lymphocytes and acts as an immune modulator by inhibiting their migration into the brain20,21. Compared to placebo, it has been shown to reduce relapse rate by 68%. Furthermore, it reduced the risk of disability progression, defined as a change in EDSS score sustained for 24 weeks, by 42%21.

Magnetic resonance spectroscopy

Magnetic resonance spectroscopy (MRS) is a non-invasive MRI sequence that allows identification and quantification of in vivo metabolites present in a small, preselected brain region. Proton nuclei (1H) are most commonly used in studies of the human brain due to their abundance and high sensitivity. MRS sequences distinguish between different metabolites by measuring the frequency at which 1H nuclei flip, which is in turn dependent on the molecular group carrying the hydrogen atom22. Measuring these metabolic changes allows researchers to gain insight into changes at a cellular and molecular level in the brain, which cannot be acquired using conventional MRI techniques23.
The thalamus is a subcortical hub, with multiple reciprocal connections to both white matter tracts and cortical grey matter24. Previous studies have shown that it is sensitive to pathology occurring in other brain regions25. We speculated that, by using the thalamus as our region of interest (ROI), the investigated metabolites would give a measure of global neuronal damage.

Aims

We investigated thalamic MRS as a biomarker for global brain neuronal damage in MS by comparing baseline metabolite concentrations between pwMS and healthy controls (HCs). Metabolites that were found to be statistically significantly different between these two groups at baseline were investigated further. To additionally support using MRS imaging as a surrogate for global central nervous system pathology, we investigated the correlation between these metabolite concentrations in pwMS and total lesion volume. In order to investigate whether thalamic MRS can be used to monitor treatment response, we measured changes in their concentration following treatment with the disease-modifying drug natalizumab.

Population

Participants aged 21-65 underwent inclusion criteria screening. For the pwMS group, this included satisfying the 2010 McDonald criteria, having highly active MS and having been scheduled to initiate natalizumab treatment as part of routine NHS care. Following ethics approval and written informed consent from participants, 17 pwMS and 12 HCs were recruited to the study. HCs underwent a baseline MRI scan, while pwMS underwent a scan at baseline and follow-up scans at 10 and 56 weeks after initiation of natalizumab treatment.

Acquisition of MRS data

All experiments were carried out on the same Siemens 3T Magnetom Verio with a 32-channel receiver head coil[LP4], used to acquire combined MRI and 1H-MRS scans.
A magnetisation-prepared rapid gradient-echo (MPRAGE) sequence was used to obtain high-definition T1-weighted scans with the following parameters: repetition time (TR) = 2300 ms; echo time (TE) = 3 ms; inversion time (TI) = 900 ms; 160 sagittal sections; slice thickness 1.0 mm; in-plane resolution of 1x1 mm2. A single voxel was placed over the left thalamus. To acquire the single-voxel scans, a Point-RESolved Spectroscopy (PRESS) sequence with variable power and optimized relaxation delays (VAPOR) water suppression (TR/TE, 2000/30 ms) was used on a single 15-mm slab. This was aligned to the T1 sequence sections (Figure 2). Four reference transients were used to align the data. The average of 96 transients was used for water-suppressed spectra. The volume of interest was 15x15x15 mm, giving a voxel size of 3.4 mL. These parameters were also used to acquire reference MRS datasets without water suppression. This was done to obtain an internal water reference, which was used to scale metabolite signals. Double inversion recovery and phase-sensitive inversion recovery sequences were also acquired.

Lesion volumes

White and grey matter lesions were identified on 160-slice T1 scans with co-registered double inversion recovery sequences. Lesions were manually segmented in T1 space using the Imperial College software ImSeg. The images obtained by this process [LP5]were used to derive proportions of grey matter, white matter and total lesion volumes. T1, double inversion recovery and phase-sensitive inversion recovery sequences were used to check for the presence of lesions in the thalamus.

Data processing

T1 and spectroscopy data were initially obtained from scans in DICOM format (dcm). A modified MATLAB (v.2015b) script was used to convert the T1 scans into NIfTI format (nii), the single-voxel spectroscopy scans into rda format, and to generate mask files in rda format.
LCModel (v.6.3-1K) was run via a second modified MATLAB script to obtain spectroscopy data from 0.2-4.0 ppm. The software is a user-independent fitting routine that works by superimposing spectra obtained in vivo onto high-resolution model spectra. It is an accurate and reliable method to quantify MRS data with short echo times (TE ≤ 30 ms)28,29. Partial volume corrections, to account for the different concentrations of water in grey matter (GM), white matter (WM) and cerebrospinal fluid (CSF), were conducted by converting T1 sequences from DICOM to NIfTI format and segmenting the obtained images using MATLAB's SPM8 toolbox. This allowed scaling of the metabolite concentrations obtained from the water-suppressed PRESS sequence to the internal water reference signal from the unsuppressed-water PRESS sequence. The segmentation was used to calculate voxel proportions of GM, WM and CSF, which are in turn needed to obtain the water concentration (WCONC) value from the unsuppressed water reference signal used to estimate absolute concentrations of metabolites. Total WCONC values for each voxel were computed in accordance with Section 10.2.2.3 of the LCModel manual29. Eddy-current correction was performed using LCModel. Relaxation effects were not corrected for, and therefore the reported metabolite concentrations will differ from the actual ones by an unknown factor. The latter is likely to be negligible, as all reported concentrations will deviate from actual concentrations by this same, unknown factor. As per LCModel's manual, metabolite concentrations were multiplied by a factor of 1.04, the specific gravity of brain tissue29, and were reported in mmol/L (mM).

Data exclusion

A heat map (Figure 4, right side) was created to check voxel placement using FSLview v.3.2.0.
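The WCONC calculation described above can be sketched as follows. This is an illustration of the general form of the correction, not the study's actual pipeline; the water attenuation constants are the commonly quoted defaults (pure water = 55556 mM, with GM and WM scaled by their approximate water contents) and should be checked against the LCModel manual for any given protocol.

```python
# Sketch of a partial-volume water-concentration (WCONC) calculation of the
# form used by LCModel. Constants below are common defaults, an assumption
# here rather than the study's exact values.
def wconc(f_gm, f_wm, f_csf):
    """f_gm, f_wm, f_csf: tissue volume fractions in the voxel (sum to 1)."""
    assert abs(f_gm + f_wm + f_csf - 1.0) < 1e-6
    numerator = 43300 * f_gm + 35880 * f_wm + 55556 * f_csf
    # Metabolites are assumed absent from CSF, so the water reference is
    # rescaled to the tissue (non-CSF) part of the voxel:
    return numerator / (1.0 - f_csf)

# A hypothetical thalamic voxel with a small CSF fraction:
print(round(wconc(0.55, 0.40, 0.05)))  # 43100
```

The CSF rescaling in the denominator is why accurate segmentation matters: an error in the CSF fraction propagates directly into every absolute metabolite concentration.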
T1 sequences and mask files were reoriented to match the Montreal Neurological Institute (MNI) standard template, followed by extraction of the brain from the surrounding tissue. T1 sequences and mask files were then registered to standard space using the MNI template, which consists of 152 averaged T1 brain scans at 2 mm resolution. The heat map is a depiction of each voxel mask overlaid onto the ch2better template for T1 sequences taken from the MRIcron software.[LP6] No MRS spectra were removed from the analysis owing to minimal inter-scan variability. Spectra generated by LCModel were checked for overall data quality in accordance with the software's instruction manual29. Two baseline HC and two pwMS spectra were excluded from data analysis (Table 1). For a metabolite to be investigated, it had to be relevant to MS pathology as evidenced by previous studies, as well as demonstrating sufficient data quality, measured by the Cramér-Rao lower bounds ratio across 75% of individual scans. Five metabolites were investigated: choline-containing compounds (Cho), glutamate (Glu), myo-inositol (Ins), total creatine (tCr) and total N-acetylaspartate (tNAA) (Table 1). In a given subject's scan, metabolite concentrations with a Cramér-Rao lower bounds (CRLB) value of ≥15% were excluded from data analysis, as per LCModel's instruction manual. Concentrations exceeding 2 standard deviations (2SD) outwith the group mean were also excluded.
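The two-stage exclusion just described (CRLB-based, then outlier-based) can be sketched as a simple filter. Variable names and thresholds are illustrative of the rule as stated in the text, not the study's actual code.

```python
import statistics

# Minimal sketch of the two-stage metabolite QC described above: first drop
# estimates with a Cramer-Rao lower bound of 15% or more, then drop values
# more than 2 standard deviations from the remaining group mean.
def qc_filter(concentrations, crlbs, crlb_cutoff=15.0, sd_cutoff=2.0):
    # Stage 1: exclude poorly fitted estimates (CRLB >= cutoff)
    kept = [c for c, crlb in zip(concentrations, crlbs) if crlb < crlb_cutoff]
    # Stage 2: exclude outliers relative to the remaining group mean
    mean = statistics.mean(kept)
    sd = statistics.stdev(kept)
    return [c for c in kept if abs(c - mean) <= sd_cutoff * sd]
```

Note that the second stage recomputes the mean and SD only over values that survived the first stage, mirroring the order of exclusions in the text.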
QC for entire spectra:

Participant group    Before spectra QC (n)    After spectra QC (n)
HCs                  12                       10
pwMS baseline        17                       15
pwMS 10 weeks        16                       16
pwMS 56 weeks        16                       16

QC for individual metabolites:

Metabolite (marker of)                           Group      Before QC (n)  After 1st QC (n)  After 2nd QC (n)
Cho (membrane turnover)                          HCs        10             9                 9
                                                 pwMS BL    15             12                12
                                                 pwMS 10w   16             16                16
                                                 pwMS 56w   16             15                15
Glu (metabolism and neurotransmitter activity)   HCs        10             6                 6
                                                 pwMS BL    15             9                 8
                                                 pwMS 10w   16             14                14
                                                 pwMS 56w   16             15                14
Ins (glial marker)                               HCs        10             7                 7
                                                 pwMS BL    15             14                14
                                                 pwMS 10w   16             15                14
                                                 pwMS 56w   16             15                15
tCr (metabolic activity)                         HCs        10             10                10
                                                 pwMS BL    15             15                14
                                                 pwMS 10w   16             16                15
                                                 pwMS 56w   16             16                16
tNAA (neuronal loss, mitochondrial activity)     HCs        10             10                9
                                                 pwMS BL    15             15                14
                                                 pwMS 10w   16             16                16
                                                 pwMS 56w   16             16                15

Statistical analysis

GraphPad Prism (v.7) and IBM SPSS Statistics 24 were used to conduct statistical analysis. Participant demographics are reported as mean and standard deviation (SD). Metabolite concentrations are reported as mean, standard error of the mean (SEM) and 95% confidence intervals. Parametric tests were used after testing for normal distribution of the data. Unpaired t-tests were used to compare metabolites between pwMS and HCs cross-sectionally. Pearson's coefficient was used to correlate metabolite concentrations with bilateral lesion volumes. A linear mixed model was used to quantify longitudinal changes in metabolite concentrations in pwMS.

MRS data were obtained from 17 pwMS (mean age (SD) 41.6 (10.6), range 21-58 years) and 12 HCs (mean age (SD) 41.9 (8.3), range 29-61 years). Mean time since diagnosis was 12.1 (10.6) years and mean Expanded Disability Status Scale (EDSS) score was 4.1 (1.1).
People with MS, n                    17
Age, mean (SD)                       41.6 (10.6)
Sex, n (%)                           M 6 (35), F 11 (65)
Years since diagnosis, mean (SD)     12.1 (10.6)
EDSS score, mean (SD)                4.1 (1.1)

Healthy controls, n                  12
Age, mean (SD)                       41.9 (8.3)
Sex, n (%)                           M 9 (75), F 3 (25)

Lower concentrations of glutamate are found at baseline in the thalami of people with highly active MS

A statistically significant difference in the concentration of glutamate was found between the two groups (7.67 ± 0.346 in HCs and 6.55 ± 0.232 in pwMS, p=0.016). No significant difference was found between the two groups for the other metabolites.

Metabolite   Healthy controls (n=10)   People with MS (n=15)   95% CI (pwMS - HCs)
Cho          1.69 ± 0.0826, n=9        1.75 ± 0.25, n=12       -0.232 to 0.216
Glu*         7.67 ± 0.346, n=6         6.55 ± 0.232, n=8       -2.00 to -0.253
Ins          3.98 ± 0.250, n=7         4.45 ± 0.281, n=14      -0.452 to 1.380
tCr          5.34 ± 0.134, n=10        5.42 ± 0.150, n=14      -0.350 to 0.510
tNAA         8.60 ± 0.134, n=9         8.46 ± 0.178, n=14      -0.656 to 0.375

Baseline thalamic glutamate concentrations in pwMS correlate negatively with total lesion volumes

Baseline glutamate concentrations in pwMS negatively correlated with T1 scan total lesion volumes (n=8; r=-0.80, p=0.017; Figure 6). No other thalamic metabolite correlated with lesion volumes. Lesion volumes in HCs (n=6) were assumed to be zero and are depicted in Figure 6, but this parameter was excluded from statistical analyses. No lesions were found in the thalami of pwMS in this study. Glutamate concentration correlated even more strongly with left hemisphere lesion volumes (p=0.0091), an expected finding given that the left thalamus was used as the study's ROI. The correlation was weakest with right hemisphere lesion volumes (p=0.030). These results are reported in Table 3.
Sampled lesion load location    r, correlation coefficient    p-value
Left hemisphere                 -0.84                         0.0091
Right hemisphere                -0.75                         0.030
Both hemispheres / total        -0.80                         0.016

Thalamic glutamate concentrations increase following natalizumab treatment

Glutamate concentrations measured in the thalami of pwMS increased significantly (p=[LP7]) between the 10- and 56-week follow-up scans (n=12 pairs of data points). At 56 weeks, no significant difference between the pwMS and HC groups was recorded, suggesting that glutamate levels had normalised[LP8]. No significant difference in glutamate concentration was recorded between the baseline and 10-week follow-up scans (n=7 pairs of data points) or between baseline and 56-week follow-up (n=7 pairs of data points).[LP9]

This observational study used proton magnetic resonance spectroscopy (1H-MRS) to compare metabolite concentrations in 17 pwMS and 12 HCs. Study findings indicate a lower baseline concentration of glutamate in the thalami of pwMS compared to HCs. In pwMS this correlated negatively with total baseline brain lesion volume, which supports our initial hypothesis that thalamic MRS, specifically measuring glutamate, can be used as a surrogate for global central nervous system pathology. An increase in glutamate concentrations was recorded following natalizumab treatment between 10 and 56 weeks of follow-up. To our group's knowledge, this is the first 1H-MRS study to identify baseline cross-sectional differences in thalamic glutamate, correlate glutamate concentrations with total lesion volumes, and report longitudinal changes in thalamic glutamate following natalizumab treatment.

Thalamic glutamate is a potential surrogate for total brain neuronal damage in highly active MS

Glutamate, the chief excitatory neurotransmitter of the central nervous system, is mainly synthesized from glutamine31,32. In addition to its neurotransmitter role, glutamate concentration is closely linked to the Krebs cycle, which reflects the cell's metabolic activity.
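For reference, the Pearson coefficient reported in the table above is computed as the covariance of the two variables normalised by the product of their deviations. A generic sketch with made-up numbers (not the study's data):

```python
import math

# Pearson's r, as used above to relate thalamic glutamate to lesion volume.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfectly inverse relationship gives r = -1, the idealised version of
# the negative glutamate/lesion-volume correlation reported here:
print(round(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]), 2))  # -1.0
```

With only eight pairs of data points, as in the study, the coefficient is sensitive to individual observations, which is worth bearing in mind when interpreting the hemisphere-level differences in r.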
Previous proton MRS studies in MS reported higher levels of glutamate in lesioned white matter of pwMS compared to HCs33,34. One of these studies also reported lower levels of glutamate in lesioned grey matter regions34. The limitation of using white or grey matter lesions as ROIs is the high heterogeneity of these brain regions. With regard to WMLs, their definition includes, among others, active, inactive and remyelinating lesions. As for grey matter, this can be affected by exposure to cytokines from meningeal follicle-like structures or, similarly to WMLs, by demyelination13,35,36. Current MRS imaging is unable to discriminate between these different pathologies. Therefore, metabolite concentrations obtained from these ROIs are likely to reflect the aforementioned local pathological changes, rather than global MS pathology. In contrast, the potential advantage of thalamic MRS is that the thalamus is rarely affected by local inflammation in MS37,38. Given that it is a subcortical hub highly connected with numerous other brain areas, this study hypothesised that the thalamus could be used as a biomarker of total brain neuronal damage in highly active MS. Two results in our study support this hypothesis: the decreased concentration of glutamate in pwMS and the negative correlation between glutamate and total brain lesion volume. Lesion volumes in MS have been found to correlate with axonal loss39 and disability40. Moreover, glutamate is mainly found in synaptic vesicles; therefore, the decreased thalamic glutamate recorded in pwMS in this study could represent neuronal degeneration and synapse loss.

Thalamic glutamate increases following natalizumab treatment

Between 10 and 56 weeks of natalizumab treatment, our group recorded a significant increase (p=,) in the concentration of thalamic glutamate in pwMS. At the end of the follow-up period, glutamate levels normalised, with no significant difference being recorded between the pwMS and HC groups.
No significant differences in glutamate concentration were found between baseline and 10 weeks (n=x pairs?) and between baseline and 56 weeks (n=x pairs?)[LP10] follow-up scans. It can be hypothesised that the limited number of pairs of data-points between baseline and 56 weeks of follow-up prevented us from recording an existing statistically significant difference. With regard to changes in glutamate between baseline and 10 weeks, there could be a significant change in glutamate concentration within this timeframe which was not picked up due to our limited sample size. It also cannot be excluded that thalamic MRS may take longer to respond to treatment. Previously published literature has shown lower glutamate concentrations in lesioned white matter of pwMS at baseline, which increased following treatment with natalizumab41. This effect can be attributed to the anti-inflammatory properties of natalizumab. By preventing the production of nitric oxide and reactive oxygen species by macrophages, the drug could reduce axonal damage otherwise caused by these compounds42,43.

Study limitations

The algorithm used by SPM8 is incapable of accurately differentiating between the brighter grey matter and surrounding white matter, as the image intensity in the thalamus is very close to the intensity of white matter. Therefore, the software records a higher white matter proportion in the thalamus than the true one. It should be noted, however, that this inaccuracy in measuring the white/grey matter ratio should not cause any systematic error that would affect the overall results. The study's HCs were adequately age-matched but poorly gender-matched to pwMS. Previous studies, however, reported no significant differences in any of the metabolite concentrations in the brain between genders44. Therefore, no correction for a gender effect was made. The HC group only had a baseline scan, with no longitudinal data recorded. A useful longitudinal control group would be untreated pwMS.
The absence of such a control group is, however, currently a common limitation, as people with highly active MS are nearly always on treatment. Having no information on the natural history of thalamic MRS in pwMS, it is difficult to interpret the significance of the longitudinal changes in glutamate seen in this study. Lastly, although the thalamus is seldom affected by inflammatory activity in pwMS, the presence of inflammatory lesions has been previously described45. Such lesions are a confounding factor, as they directly influence measured metabolite concentrations. However, based on T1, double inversion recovery and phase-sensitive inversion recovery sequences, no thalamic lesions were observed in our study.

Future work

Studies with larger sample sizes are needed to confirm our baseline findings, as well as to confidently interpret longitudinal changes in glutamate concentrations following natalizumab treatment. The presence of a pwMS untreated control group is not justifiable on ethical and legal grounds, however fu

Tuesday, November 12, 2019

Mother Teresa Essay

She dedicated her life to serving the poor. She loved the unloved, cared for the uncared-for, and helped the dying, the crippled, and the mentally ill. She served everyone with her love and the love of God. She touched the hearts of those who doubted her because of her love and commitment to God. Mother Teresa lived an extraordinary life.

Agnes Gonxha Bojaxhiu, later named Mother Teresa, was born on August 26, 1910 in Skopje, Yugoslavia. She was born into an Albanian Roman Catholic family. There were three children, one boy and two girls; she was the youngest. She attended the government school. In her teens, Agnes became a member of a youth group in her local parish called Sodality. Through her involvement with its activities, guided by Yugoslavian priests, Agnes became interested in missionaries in India. There, letters from Yugoslavian priests working in Bengal were read. Young Agnes was one of the Sodalists who volunteered for the Bengal Mission.

When she turned eighteen, she left home to join the Irish Loreto order, whose Sisters ran a mission in Calcutta, India. Mother Teresa's first assignment was teaching high school girls in Calcutta from 1929 to 1946. There she taught geography at St. Mary's High School. For some years, she was principal of the school and was also in charge of the Daughters of St. Anne, the Indian religious order attached to the Loreto Sisters.

By December 1, 1928 Agnes Gonxha Bojaxhiu had chosen the name of Sister Mary Teresa of the Child Christ, after Thérèse of Lisieux. "On May 24, 1937, Sister Teresa committed herself to her vows of poverty, chastity, and obedience for life and in doing so became, as was then usual for Loreto nuns, 'Mother Teresa'" (Spink 17).

In 1946, she wanted to work directly with the poor. She applied for permission to go out and work among the poor in the slums of the city.
"It was among these people that she felt a call to work, and to spend the rest of her life, in daily contact with them" (Spink 224). Her request to work with the poor was granted. Finally, she changed from the uniform of the Loreto order to the customary cheap Indian sari. Her work started after an intensive course in nursing.

In addition to the Sisters, Mother Teresa founded four other bra... ...d.

Establishing one's goal and devoting her life to the helpless: this is the story of Mother Teresa. Not only was she a servant of God, she was recognized as a mother to many. "I'm just a little pencil in his hand. Tomorrow, if he finds somebody more helpless, more hopeless, I think he will do still greater things with her and through her" (Crimp 85). Mother Teresa lived an extraordinary life. With one word to describe her, I would say she was miraculous.

Works Cited
Crimp, Susan. Touched by a Saint: Personal Encounters with Mother Teresa. Notre Dame, Indiana: Sorin Books, 2000.
Muggeridge, Malcolm. Something Beautiful for God: Mother Teresa of Calcutta. New York: Harper & Row, 1971.
Spink, Kathryn. Mother Teresa: A Complete Authorized Biography. San Francisco: HarperSanFrancisco, 1997.
---. The Miracle of Love: Mother Teresa of Calcutta, her Missionaries of Charity, and her Co-workers. San Francisco: Harper & Row, 1981.

Sunday, November 10, 2019

Case Study of river pollution Essay

Introduction
River pollution has caused loss of lives and imbalances in the ecosystem. People, industries and natural causes contribute to the pollution of rivers, making the waters unsafe for both animal and human consumption. Moreover, what happens upstream may not be known to those at the lower part of the river. In consequence, governments have come up with laws and regulations to curtail practices that may render the water harmful. Irrespective of the rules, river pollution still takes place. This study draws on the literature to examine the factors that surround river pollution.

The Ganga River
This river has its source on the southern slopes of the Himalayan ranges, fed by glaciation at Gangotri, four thousand metres above sea level. The river flows through the mountains for two hundred and fifty kilometres before descending to an elevation of two hundred and eighty-eight metres above sea level. The Mandakini and Alaknanda are its tributaries. This river carries the largest quantity of silt in the world, which is deposited at its delta (Wohl, 2011). According to Wohl (2012), for a long time this river enjoyed its purity, but due to human encroachment it has become much polluted. The purity of river water depends on its velocity: the faster it flows, the higher the purity. This river has numerous obstructions so that it can be utilized for irrigation purposes. With the escalation of commerce and communications, many towns have developed along the river. The river is polluted by industrial and domestic wastewater, mass bathing as a performance of rituals, defecation at its banks by people from low-income families, animal carcasses, unburned and half-burned human corpses thrown into the river, agricultural residues from fertilizers and pesticides carried in by surface runoff, and solid garbage that is thrown directly into the river (Agre, 2013).
In consequence, according to Ghosh (2012), the Ganga is now a poisonous river laden with pollutants. These pollutants include heavy metals, which are capable of causing cancer in the population.

Key Players

Ministry of Environment and Forests
This is the major body in India that deals with all environmental issues at the central government level. It funds and exercises control over all other bodies and agencies that conserve the environment, and it oversees and supervises their activities and financial spending. The ministry has been urged by some other bodies to change its proposal so as to improve pollution control for this river (Gopal & Agarwal, 2003).

The Central Pollution Control Board (CPCB)
This is the body that deals with all issues pertaining to the environment and its pollution in India. It undertook a study from 1981 to 1982 which enabled it to classify the ways the river is utilized and the pollution load. The report generated by this study was the genesis of the Ganga Action Plan. From this report, it was established that pollution came from pesticide and fertilizer use in agriculture, industrial wastes, domestic wastes and land use practices. This information was the basis on which the Department of Environment framed a policy (Gopal & Agarwal, 2003).

The Ganga Project Directorate (GPD)
According to Jain (2009), this body was founded in 1985 under the National Ministry of Environment and Forest. The rationale behind its formation was for it to become a secretariat to the CGA and also the apex nodal agency for the entire implementation process. Moreover, this body was to synchronize the activities of the various ministries that take part in the administration of funds. It was conceived as a single investment that would be able to achieve the goal of improving the quality of the water.
The plan for this body was to be executed by the state governments, which would assume management and operational tasks. The work of the GPD was to exercise overall supervision. This body was to remain intact until the completion of the GAP. The goal of the entire plan was to divert the wastes generated in urban dwellings away from the river, enabled by treating the wastes through recycling and reuse. For the efficiency of this plan, it was found that research was indispensable. This was to ascertain the nature and sources of pollution. In addition, research would give an underpinning on which the most applicable plan for the utilization of the resources of the Ganga River for forestry, animal husbandry and agriculture could be established. Additionally, the demographic, human and cultural settlement along the banks of the river would be ascertained. This led to the involvement of fourteen universities (Singh, 2007).

National Ganga River Basin Authority (NGRBA)
This body was set up in 2009 as a nodal agency to supervise the coordination of authorities and the planning, monitoring and financing of all activities directed towards the eradication of pollution and the conservation of all rivers. It was chaired by the prime minister and was founded under the NGRBA Act (The Energy and Resource Institute Consultant, 2011). Its activities were supposed to cover the cleaning of rivers in all states. The Ganga River was a main target of this body due to an international conference dealing with environmental issues that had been held two years prior. Through this body, corporate and civil bodies as well as citizens were supposed to participate, with the ultimate goal of alleviating river pollution (Agre, 2013).

Foreign Aid
Some countries and foreign bodies decided to partner with the Indian government with the chief goal of rescuing this river, which is in dire need of intervention.
Among them is the Israeli government, which was in a position to cooperate with the IITs through the provision of technological knowledge (Nandan, 2012). Additionally, the Australian government has the goal of contributing to the salvation of the Ganga River by funding projects designed to protect the river from industrial pollution through the AusAID program. The country also pledged to provide India with experts who would help develop better, sustainable and safe methods for the management and disposal of the waste generated by the tanneries.

Governance Challenges
One challenge facing the policy and mitigation plan is that pollution is partly caused by municipal sewage, which is itself a component of government. Additionally, some of the industrial wastes were found to be extremely toxic and hard to manage. In the same context, the government set up regulations to control pollution by the industrial sector. A setback that emerged is that some of the industries did not comply and were therefore forced to close down. The government had to engage in legal tussles with such companies, a step that led to expenditure and consumed time. In this regard, commercialization has increased along the shores of this river. This has led to the establishment of many industries and tanneries along the river which do not treat, or do not adequately treat, their effluent before discharging it into the river. The government has tried several approaches, including incentives, to persuade the owners to treat their effluent. This has not yielded much fruit, as some of them have not incorporated the plan into their practice (Bharti, 2012). The governance and management of the projects was under the docket of the state governments. They partnered with non-governmental organizations and foreign aid agencies, which introduced new obstructions to the conservation plan.
This is because the non-governmental organizations came with their own mandates, with which the state governments were supposed to comply. This impeded the decision-making process. It not only resulted in delays to the entire project but also gave room for the justification of contractors' shortcomings (Chatterjee, 2008). The government is trying to put up mechanisms and projects that will alleviate pollution, to enable the water to attain at least bathing quality. With reference to Nandan (2012), this effort suffered a blow when some of the members of the National Ganga River Basin Authority (NGRBA) stepped down from the task. Their reason was that they had found the government was not straightforward about the goal of averting pollution with regard to the Ganga River.

Value Conflicts
There has been an issue of whether to privatize the waters of the Ganga River. Most arguments have been against this. The arguments in favour rest on the thought that water is an economic good and, as such, should be utilized for commercial purposes. Some people suggested that water from the river should be bottled and sold at the market. This is in line with the draft water policy, which echoed that, due to the economic value of water, it cannot be provided for free. This means that the water still faces greater chances of overuse.

Contested Knowledge
Hindus believe that the waters of the Ganga River are holy, and the river has therefore been used for ritual activities since time immemorial. This has led to misuse, pollution and overuse. Additionally, given that plastics and polythene are not biodegradable, according to the Governance Knowledge Center (2012) the high court asked the government to ban the use of these materials in all cities situated along the Ganga River.
The court also recommended that the state government should encourage citizens to use biodegradable products. The same court ordered the administration to prohibit sewage discharges into the river. The court indeed brought out very good suggestions, but it would be challenging for the government to implement them, because some products are packaged in plastic and polythene. If people were to abstain from the use of plastics and polythene, it would mean giving up products they employ in their daily lives. Water recycling has been employed as a chief way of dealing with the effluents generated by industries and households. There are twenty-nine thousand industries in Kanpur, among which four hundred are tanneries. Although large transnational companies charged with the task of wastewater treatment have been set up, the ultimate truth is that not all the water generated by the companies can be treated and used for agriculture year in, year out. Subsequently, some of the water has to come back to the river. This is one factor that did not yield fruit in GAP 1, as pointed out by Bharti (2012).

Competing Interests
The condition of the river has gone from bad to worse. This is on the grounds that those in charge of policy and decision making for the whole reclamation process do not depend on the river for their livelihoods (Thakkar, 2013). Whether the water is clean, or whether the river flows or not, their lives do not depend on it. Those whose livelihoods do depend on this river are nowhere near the position of making key decisions. Correspondingly, there has been emphasis on pipes, pumps and new plants, but no strategies for the management and governance of the river regime. For the sake of operation, sewage plants have been established, but they do not function to capacity. The quality of their services is poor and no one has been held responsible.
This in turn contributes to more pollution. As for the Ganga campaigns, the river is not supposed to be connected to sewage, but the reality on the ground is that the river is a sewer in itself, according to Thakkar (2013). The Ganga campaigns have pressed for the halting of the project works at the Mandakini, Alaknanda and Bhagirathi tributaries, but the government has commissioned them anyway. This is despite the Forest Advisory Committee twice declining to validate the project. Additionally, the Wildlife Institute of India also recommended that the project should not be given the go-ahead.

Institutional Barriers
The Ganga Action Plan, which was set up in 1985, was supposed to conclude by March 1990. According to Gopal and Agarwal (2003), this deadline was not met; instead, many further deadlines arose from it. By 2008, the project was still ongoing and nowhere near conclusion. This slow pace has been attributed to many factors. The government was found not to release sufficient funds for the project, leading to its intermittent stagnation. This is because the government puts the money designated for this project to other uses. GAP was to discharge its duties by establishing river fronts, enhancing ghats used for bathing, providing electric crematoria, dealing with toilet complexes, setting up treatment plants for industrial effluents, laying down treatment plants for sewage, and coming up with effective mechanisms for handling the municipal wastes that accounted for seventy-five percent of Ganga river pollution. The Ministry of Environment and Forests did not set up a timeline and deadlines for the submission of reports on the undertakings of GAP. The court had set deadlines, but the ministry had no strategy for ensuring compliance with them (Gopal & Agarwal, 2003). GAP itself could not account for its expenditure, with reference to Agre (2013).
Some of the funds had been misappropriated, and very often the work had not been accomplished. This was so both at the national level and at the National River Conservation Directorate (NRCD). Regarding finances, the states complained that inadequate funds had been the stumbling block that inhibited them from achieving the goals of the project. On the contrary, the funds that had been issued by the central government had not been effectively and faithfully utilized on the project.

Conclusion
The Ganga River has been encroached upon, and this has led to the extinction of some animal and plant species. In addition, human lives, especially those of the poor who depend solely on the river for their water needs, are rendered vulnerable. The government needs to take its strategies seriously. All the projects set up should be monitored so that they are completed in the set time. All the associated bodies, the people and the industries should carry out activities that improve the life of this river.

References
Agre, P. (2013). River Ganga in dire state of pollution and governance affairs. SERI News, 7(10), 42-50.
Bharti, S. (2012, July 31). Strengthen participatory urban governance to prevent pollution in Ganga at Kanpur and recognise the need to look for decentralized solutions. India Waterportal, pp. 36-42.
Chatterjee, S. (2008). Water resources, conservation and management. New Delhi: Atlantic Publishers & Distributors.
Ghosh, A. (2012, October 17). Ganga is now a deadly source of cancer, study says. The Times of India, pp. 23-24.
Gopal, K. & Agarwal. (2003). River pollution in India and its management. New Delhi: APH Publishing Corporation.
Governance Knowledge Center. (2012, December 7). Allahabad High Court asks UP government to regulate pollution in river Ganga. Retrieved September 30, 2013, from indiagovernance.gov.in/news.php?id=1861
Jain, A. (2009). River pollution: regeneration and cleaning. New Delhi: A.P.H. Publishing Corporation.
Nandan, T. (2012, March 14). Israel ready to help India check Ganga pollution. Governance, pp. 22-17.
Singh, L. (2007). River Pollution. New Delhi: A.P.H. Publishing Corporation.
Thakkar, H. (2013, June 5). The Plight of Severely Polluted Ganges River. Epoch Times, pp. 15-17.
The Energy and Resource Institute Consultant. (2011). Environmental and Social Analysis. New Delhi: National Ganga River Basin Authority.
Wohl, E. (2011). A World of Rivers: Environmental Change on Ten of the World's Great Rivers. Chicago: University of Chicago Press.
Wohl, E. (2012, March 5). The Ganga-Eternally Pure? Global Water Forum, pp. 27-30.

Friday, November 8, 2019

Examining The Knowledge Behind Creation Information Technology Essay

Tacit knowledge is personal knowledge embedded in individuals, based on their experience and involving such intangible factors as personal belief, perspective, and values. Other types of knowledge, based on intent and use, are [5]:
§ Know-what: This is the fundamental stage of knowledge, e.g. people, groups or organizations know what they know (perhaps through their formal education) but do not know when and how to apply that knowledge to solve problems.
§ Know-how: Represents the ability to translate learned knowledge into real-world results, e.g. knowing when to use certain knowledge to solve real-world problems.
§ Know-why: Goes beyond the know-how stage. This knowledge enables individuals to move a step beyond know-how and create extraordinary leverage by using knowledge, including the ability to deal with unknown interactions and unseen situations.
§ Care-why: Represents the self-motivated creativity that exists within the individuals in a company. This is the only level that cannot be supported by a knowledge management system, but it may be supported through motivation / human resource practices.

B. Knowledge Creation, Capture and Conversion
Knowledge creation always begins with an individual or a group of individuals who, individually or as a group, come up with new ideas, concepts, product or process innovations, etc. Knowledge creation may happen through research, innovation projects, experimentation, observation, etc. Firestone et al. [2] suggest that knowledge production starts with knowledge claim formulation, followed by individual and group learning, information acquisition, knowledge claim evaluation and, finally, the building of organisational knowledge.
According to Nonaka et al. [4], the organisational knowledge creation/conversion process is based on a simple framework that contains two dimensions. The first dimension holds that only individuals create knowledge, while the other dimension relates to the interaction between tacit and explicit knowledge. These two dimensions constitute the basis for defining the four knowledge creation/conversion processes: Socialization, Externalization, Combination and Internalization.
§ Socialization: tacit knowledge is converted into tacit knowledge during discussions, communications, meetings, etc.
§ Externalization: tacit knowledge is converted into explicit knowledge, and embodied in documents, manuals, etc.
§ Combination: explicit knowledge is converted into another form of explicit knowledge.
§ Internalization: explicit knowledge is converted by individuals into tacit knowledge.
[Image: http://www.trainmor-knowmore.eu/img/1.3.2.jpg] Figure 1.3(2): Knowledge Conversion. Source: Nonaka et al. (1995)
The four different modes of knowledge conversion build a knowledge spiral without a start or an end. This continuous and dynamic process has its roots in the behaviour of the main knowledge-creating agent: the human being. For example, when people are trying to combine explicit knowledge (i.e. when someone uses mathematics and physics formulas to solve a complicated problem) they might, at the same time, discuss it with their peers (other students or teachers), thus exchanging tacit knowledge with them. Furthermore, they might visit a student discussion forum looking for solutions, where they will have to present or explain their problem (and the related knowledge) when asking for more help. Nonaka et al. also describe a five-phase model of the organisational knowledge creation process consisting of the following phases:
§ Sharing tacit knowledge: corresponds to socialisation;
§ Creating concepts: the shared knowledge is converted into explicit knowledge, building a new concept;
§ Justifying concepts: the justification of new concepts allows organisations to determine whether they are truly worth pursuing;
§ Building an archetype: the worthy concept is converted into a model, prototype or operating mechanism, etc.;
§ Cross-levelling knowledge: the knowledge created is expanded across the organisation.
[Image: http://www.trainmor-knowmore.eu/img/1.3.3.jpg] Figure 1.3(3): Five-phase model of the organisational knowledge-creation process. Source: Nonaka et al. (1995)
Knowledge capture can span the whole set of activities performed by an organisation, starting with the organisation of customer and market information, through the collection of examples of best practice or lessons learned, to the development of a mentoring programme. It is important to capture both explicit and tacit knowledge, even though the latter creates more difficulties. Tacit knowledge is contained in rumours, legends, storytelling, norms, beliefs, etc., while explicit knowledge is stored in books, documents, databases, networks, e-mail, etc. The capture of explicit knowledge is the systematic approach of capturing, organising and refining information in a way that makes it easy to find, while also facilitating learning and problem solving. Tacit knowledge management is the process of capturing the experience and expertise of the individual in an organisation and making it available to anyone who needs it.
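The four SECI conversion modes described earlier each pair a source knowledge type with a target knowledge type. As a toy illustration of that pairing (this sketch is mine, not from the source), the model can be written as a small lookup table:

```python
# Nonaka's four knowledge-conversion modes (SECI), keyed by
# (source knowledge type, target knowledge type).
SECI_MODES = {
    ("tacit", "tacit"): "Socialization",
    ("tacit", "explicit"): "Externalization",
    ("explicit", "explicit"): "Combination",
    ("explicit", "tacit"): "Internalization",
}

def conversion_mode(source: str, target: str) -> str:
    """Name the SECI mode that converts `source` knowledge into `target` knowledge."""
    return SECI_MODES[(source, target)]
```

For instance, writing a manual from personal experience is `conversion_mode("tacit", "explicit")`, i.e. Externalization, while learning a routine by studying a manual is Internalization.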
During cognition creative activity, capturing and codification procedures, new constructs or cognition claims are developed which need, in general, to be tested or validated at a ulterior phase in order to find their veracity and value. This implies that the new constructs are of greater value than the bing 1s. It should non be forgotten, nevertheless, that the bulk of companies and employees accumulate and capture cognition unconsciously through several methods, patterns and state of affairss. Some of them are consistently managed by company direction while others are non as they occur during day-to-day work and pattern ( e.g. acquisition by making, informal meetings, detecting or listening to others, lessons learned, etc ) . The rating of new constructs could be made against the company ends and vision, and its value assessed in footings of improved organizational effectivity and fight. The balanced scorecard is a perfect instrument which relates the cognition position of the company with its fiscal state of affairs, clients, concern procedures and learning/growth positions. However, in the rating of new constructs SMEs should be careful non to come in into clip devouring and dearly-won processs with small obvious value for the company. Research on the balanced scorecard method shows its added value for larger companies, hence, it is non developed farther in this Handbook. If new cognition is coming from experiments or observations, it needs to be analyzed, explained and verified. There is a demand to bring forth hypothesis, for illustration, in order to explicate experimentation or observation consequences, every bit good as to set up conformity between new and bing cognition. The entire cognition pool is besides updated by integrating the new cognition [ 10 ] . By and large, knowledge acquisition from persons or groups can be characterized as the transportation and transmutation of valuable expertness from a cognition beginning ( e.g. 
, human expert, documents) to a knowledge repository (e.g., organisational memory, intranet, documents, etc.) [10]. Organisational memory comprises all knowledge elements, from tacit knowledge elements (based on the experience of employees) to tangible data and information, which may be stored in the organisation's archives. Unless knowledge is embedded into such tangible systems, or indirectly accessed through the effective participation of people in knowledge-intensive tasks, the organisation cannot leverage the knowledge held by its individual members. Organisational knowledge acquisition is the amplification and articulation of individual knowledge at the organisational level so that it is internalised into the firm's knowledge base.

C. Enabling Conditions for Knowledge Creation
Given that knowledge creation is a complex and fuzzy process, the main role of the organisation is to provide the proper context for facilitating group activities as well as the creation and accumulation of knowledge at the individual level. The following five conditions [13] are considered requirements for promoting the knowledge-creation spiral described in Figure 1.3(2) above:
• Intention: The level of organisational aspiration towards its goals is a driver of the knowledge spiral. Within business settings, the efforts to achieve the goals usually take the form of a strategy. From the viewpoint of organisational knowledge creation, the essence of strategy lies in developing the organisational capability to acquire, create, accumulate and exploit knowledge. The most critical element of a corporate strategy is to create a clear vision of what kind of knowledge should be developed and to implement that vision effectively in practical terms. This process is referred to in the management literature as strategy operationalisation.
This refers to the process required for a strategy to be transformed from a vision or a documented plan into actual everyday actions with concrete and measurable results. In a KM context, this process implies the conversion of strategic KM visions and goals into decisions and practices at an operational level. Given that knowledge is very context-specific, the operationalisation of a KM strategy could also be referred to as KM customisation, reflecting the existing organisational structure, culture, staffing issues, business operations, products and customers.
• Autonomy: Autonomy is the second condition for promoting the knowledge spiral. It increases the motivation of individuals to create new knowledge or original ideas. By allowing individuals and groups to act autonomously, the organisation may increase the possibility of introducing unexpected opportunities. Self-organised teams serve as a basis for innovation creation in Japanese companies.
• Fluctuation and creative chaos: Fluctuation (the breakdown of routines, habits, etc.) and creative chaos increase tension and focus attention on defining problems and resolving crises. They promote the knowledge spiral by strengthening the subjective commitment of individuals as well as stimulating interaction with the external environment. Fluctuation and creative chaos act as a trigger for individual members to change their fundamental ways of thinking and to challenge existing concepts. They also help them externalise their hidden tacit knowledge.
• Redundancy: In business organisations, redundancy refers to the intentional overlapping of information between employees, departments, etc. about various business activities, management responsibilities and the company as a whole. It is characterised by the existence of information that goes beyond the immediate operational or functional requirements of specific organisational members. This does not mean that this knowledge is not useful.
Rather, it helps speed up the knowledge-creation process through the sharing of redundant information. It is important at the concept development stage, where certain employees, functions or departments have information and knowledge beyond their own functional boundaries, e.g. on other areas of the organisation. This external information and knowledge can help them generate additional creative and innovative capacity. Redundancy of information enables staff to contribute to discussions more actively and to justify their ideas clearly, using widely known corporate business terms or company jargon. In addition, redundancy of information supports smooth changes in the corporate hierarchy. This is important for organisations with high employee turnover, where there is a consequent risk of sudden and frequent loss of tacit knowledge. Job rotation is one way of incorporating the benefits of redundancy.
• Requisite variety: An organisation's internal diversity should match the variety and complexity of its environment. Providing equal access to information within the organisation supports the exchange of different viewpoints and interpretations of new information. Organisational members can cope with many unexpected events if they have a variety of information and experience. This variety can be enhanced by combining information differently, flexibly and quickly [13].

D. Techniques for Knowledge Capture
The following three major approaches to knowledge acquisition from individuals and groups are applicable to the capture of tacit knowledge. In many cases, the approaches can be combined [10]:
• Interviewing experts: structured interviewing of subject matter experts is the most frequently used technique for rendering the key tacit knowledge of an individual into more explicit forms. In many organisations, structured interviewing is performed through exit interviews held when knowledgeable staff are near retirement age.
• Learning by being told: the interviewee expresses and refines his or her knowledge while, at the same time, the interviewer or knowledge engineer clarifies and validates it, rendering the knowledge in an explicit form. This form of knowledge acquisition typically involves domain and task analysis, process tracing, protocol analysis and simulations. Simulations are especially effective for the later stages of knowledge acquisition: validating, refining and completing the knowledge capture process.
• Learning by observation: observation is an important tool that can provide a wealth of information. Silent observation is best used to capture the spontaneous nature of a particular process or procedure.
A number of other techniques may be used to capture tacit knowledge from individuals and from groups, including [10, 11]:
• Storytelling: stories are another excellent vehicle for both capturing and codifying tacit knowledge. An organisational story is a detailed narrative of management actions, employee interactions, and other intra-organisational events that are communicated informally within the organisation. Conveying information in a story provides a rich context, causing the story to remain in conscious memory longer and creating more memory hooks than is possible with information out of context. Stories can greatly increase organisational learning, communicate common values and rule sets, and serve as an excellent vehicle for capturing, codifying and transferring valuable tacit knowledge.
• Questionnaires or surveys: when a large group of people needs to be interviewed, a questionnaire can be a first step, followed by individual interviews. The questionnaire may include closed-ended and/or open-ended questions. The latter are best for gaining more information, as they do not limit the respondent to a set of predefined answers.
• Brainstorming or ad hoc sessions: sessions of no more than 30 minutes for sharing ideas in a stimulating and focused atmosphere. They can take place as face-to-face meetings or make use of technologies such as instant messaging, e-mail, teleconferencing and chat rooms.
• Focus groups: structured sessions in which a group of stakeholders is asked to share their views on a previously presented solution.
• Learning histories (lessons-learned debriefings): a retrospective account of significant events that occurred in the organisation's recent past, described in the voice of the people who took part in them. The learning history process starts with planning, which establishes the scope of the learning history to be captured. Participants are then asked to share their analysis, their evaluation, and the judgement they used. Other insights emerge, and the capture and codification of these insights helps increase the organisation's reflective capacity. Next, the information gathered from the interviews is synthesised into a summary format that makes it easy for others to access, read and understand. The content is then written up, validated and published in order to disseminate the learning history and to ground it as part of the organisational memory. A learning history is thus a systematic review of successes and failures carried out in order to capture best practices and lessons learned.
• Documentation: this could include documentation from existing systems, archival information, policies and procedural manuals, reports, memos, meeting notes, standards, e-mails, public regulations, other guides, etc.
• Participation: learning by doing, or on-the-job training, is invaluable both for gaining experience and for obtaining knowledge. It is experimental, deductive learning that seeks to make sense of events and to establish causal links between actions and outcomes.
Apprenticeships, internships or traineeships and mentoring are all forms of experienced, skilled individuals passing knowledge on to a novice.
• Task analysis: an approach that looks at each key task an expert performs and characterises the tasks in terms of the prerequisite knowledge/skills required, consequences of error, frequency, difficulty, and interrelationships with other tasks and individuals, as well as how the task is perceived by the individual (routine, dreaded, or eagerly anticipated). It can be done by (silent) observation or as an interview conducted by the knowledge engineer.
• Learning from others: this can involve activities such as external benchmarking, i.e. learning what the leaders are doing in terms of their best practices, either through publications or site visits, and then adapting and adopting those practices. Benchmarking helps identify better ways of doing business. Other learning sources include company acquisitions or mergers, attending conferences and expositions, and commissioning specific studies. Inviting guest speakers to an organisation presents yet another opportunity to bring in a fresh perspective or point of view.

E. Knowledge Codification
Knowledge codification serves the pivotal role of allowing what is known in the organisation to be shared and used collectively. By converting knowledge into a tangible, explicit form such as a document, knowledge can be communicated much more widely and at lower cost. Knowledge must be codified in order to be understood, maintained and improved upon as part of corporate memory. People have always used some type of knowledge codification in their everyday activities to make communication and discussion more effective. Work or business jargon, e-mail, and computer programmers' technical language are just some examples.
However, it is impossible to codify in a document or a database the knowledge, skills, expertise, understanding and passion of an employee. In this case, the best solution is to provide a link to the sources of knowledge using knowledge maps, company yellow pages or a company guide. These issues are examined later in this Handbook (Chapters 3.1.5, 3.1.7).
The codification of explicit knowledge can be achieved through a variety of techniques such as cognitive mapping, decision trees, knowledge taxonomies and task analysis [10]:
• Cognitive maps: once expertise, experience and know-how have been rendered (made) explicit, the resulting content can be represented as a cognitive map. A cognitive map is a representation of the mental model of a person's knowledge and provides a good form of codified knowledge. In the map, the nodes represent the key concepts, while the links between them show the interrelationships between the concepts. Cognitive mapping is thus based on concept mapping, and allows experts to construct knowledge models. Such maps can show multiple views or perspectives on the content (Figure 1.3(4)).
http://www.trainmor-knowmore.eu/img/1.3.4.jpg
Figure 1.3(4): Example of a Concept Map. Source: Dalkir (2005)
• Decision trees: typically take the form of a flow chart, with alternative paths indicating the impact of different decisions made at each branching point. A decision tree can represent many rules, and when you execute the logic by following a certain path, you effectively bypass the rules that are not relevant to the case in hand (Figure 1.3(5)).
http://www.trainmor-knowmore.eu/img/1.3.5.jpg
Figure 1.3(5): Example of a Decision Tree.
Source: Dalkir (2005)
• Knowledge taxonomies: concepts can be viewed as the building blocks of knowledge and expertise. Taxonomies are basic classification systems that enable us to describe concepts and their dependencies, typically in a hierarchical way. The higher up a concept is placed, the more general or generic it is; the lower it is placed, the more specific an instance it is of the higher-level categories. This approach allows the lower, more specific concepts in the taxonomy to incorporate directly the attributes of the higher-level, or parent, concepts (Figure 1.3(6)).
http://www.trainmor-knowmore.eu/img/1.3.6.jpg
Figure 1.3(6): Example of a Knowledge Taxonomy

3.13.3: What are the Implications for Organisational Learning / Training in terms of Tacit Knowledge?
• Tacit knowledge can be a sustainable competitive advantage. The difficulty inherent in tacit knowledge transfer is that subject matter experts and key knowledge holders may not be aware of, and are therefore unable to articulate, communicate and describe, what they know.
• Tacit knowledge is embedded in group and organisational relationships, core values, assumptions and beliefs. It is hard to identify, locate, quantify, map or value.
• Tacit knowledge is embedded in the individual.
The distribution and effective use of explicit knowledge, then, can often be achieved through systematic training: the planned and organised development of the skill, knowledge and attitude required by an individual in order to perform a specific job or task to a given standard of performance. Furthermore, systematic training is planned in a logical sequence in which a training need is identified, a plan is put in place to address the need, and the plan is implemented, evaluated and assessed.
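Returning briefly to the knowledge taxonomies described above: their defining property, that lower concepts directly incorporate the attributes of their parent concepts, can be sketched in a few lines of code. This is a purely hypothetical illustration; the Concept class and the concept names are invented for this sketch and do not come from the Handbook:

```python
# Hypothetical sketch of a knowledge taxonomy. The Concept class and the
# concept names are invented for illustration only. The point shown: a
# lower (more specific) concept incorporates the attributes of its
# higher-level (parent) concepts.

class Concept:
    def __init__(self, name, attributes=None, parent=None):
        self.name = name
        self.own_attributes = attributes or {}
        self.parent = parent

    def attributes(self):
        # Merge inherited attributes with the concept's own; a more
        # specific concept may override a value set by its parent.
        inherited = self.parent.attributes() if self.parent else {}
        return {**inherited, **self.own_attributes}

# A three-level taxonomy: the lower the concept, the more specific it is.
knowledge = Concept("knowledge", {"codified": False})
explicit = Concept("explicit knowledge", {"codified": True}, parent=knowledge)
manual = Concept("procedure manual", {"medium": "document"}, parent=explicit)

print(manual.attributes())
# → {'codified': True, 'medium': 'document'}
```

Here the most specific concept inherits everything from its ancestors while overriding the parent's "codified" value, mirroring the generic-to-specific structure of Figure 1.3(6).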
The distribution and effective use of tacit knowledge is not, however, easily achieved through systematic training. Often such knowledge cannot reach those who need it without direct, face-to-face contact and the use of less-structured methods, including "sink or swim", where a person learns by doing and through experience, and "sit by Nellie", where they act as an apprentice to an older, skilled, knowledgeable worker.

2.5.4.3: Different Ways of Learning / Sharing and Using Tacit Knowledge
While training is necessary, much of it is wasted because of the flexibility of the labour market: employees can easily come and go. [6] Much of an organisation's strength, its know-how and experience, is institution-specific. As such, it cannot be rehired when people leave, only learned, with new appointees generally expected to absorb it by osmosis. Many organisations are now using various experiential learning techniques to develop staff and to support them in learning from experience.
There are two involuntary learning approaches and several planned learning approaches. Unconscious learning and incidental learning are the involuntary approaches. They are unstructured, informal and usually involve mulling over incidents. It can be argued that this learning results in knowledge residing in the individual or individuals concerned. Most of the more formal, planned learning approaches revolve around retrospective learning. Like unconscious and incidental learning, it often centres on incidents or activities, but with a conscious intention to learn.
Reflective learning can occur both proactively and defensively. Kraft, for example, proactively decided to cut back on the fat and sugar content of their products and the size of portions, because they wanted to contribute to action on obesity and/or because they were afraid overweight customers would sue, as happened with the tobacco industry.
The use of case studies, internal audits and post-project reviews falls into this category. Defensive learning is where an organisation sets out to learn from an incident with a view to avoiding its happening again.
Action learning uses a skilled facilitator to impose a discipline of self-reflection and analysis on the team members of individual projects. The aim is to enable managers to detect and prevent errors, and to transfer information accurately or achieve goals successfully.
Prospective learning is an approach that includes elements of retrospective learning along with the more proactive intention of planning to learn before an experience takes place, and more rigorous processes to capture the more elusive elements of existing data, information and knowledge. Case study examples are provided in "Corporate DNA: Using Organizational Memory to Improve Poor Decision-Making", Arnold Kransdorff, Gower Publishing Ltd., 2006.
Benetton, for example, deliberately learns from experience through trial and error, experimenting and retaining what has worked at each stage of the company's development. [8]
BP has a special post-project appraisal unit to review major projects and to write up case studies and lessons, which are then incorporated into revisions of the company's planning guidelines. [9]
Ford used oral debriefing [7] techniques in which interviewees record their experiences anonymously and in their own words, in a way that reflects their corporate learning experience. The transcripts were then used to extract insights that became a best-practice manual for others undertaking similar projects. [10] The armed forces, the World Bank, Bass, Cable and Wireless and Digital Equipment Corporation have also used this process.
Other tools such as exit interviews can provide sources of learning for new staff, while experienced former employees can also be brought in to support learning among new staff.
3.13.5: Case Study Material on Organisational Learning / Training and the Implications for Tacit Knowledge
Ikujiro Nonaka and Hirotaka Takeuchi's book The Knowledge-Creating Company (1995) brought the concept of tacit knowledge into the realm of corporate innovation. In it, they suggest that Japanese companies are more innovative because they are able to successfully collectivise individual tacit knowledge within the firm. The two researchers give the example of the first Japanese bread-making machine, whose development was impossible until the engineers interned themselves with one of Japan's leading bakers. During their internship, they were able to learn the tacit motions required to knead dough, and then transfer this knowledge back to the company.
Nonaka and Takeuchi's account concerns the development of the first fully automated bread-making machine for home use, developed by Matsushita. It was introduced to the Japanese market in 1987 and was a sales success.
According to the standard account, the design team faced three problems in developing the machine. The first was how to mechanise the dough-kneading process, which is "essentially tacit knowledge possessed by master bakers". The other two concerned temperature and ingredient variability: "The ideal [ambient temperature] was 27 to 28 degrees centigrade, yet the variation in Japan ranged between 5 and 35 degrees centigrade. Different brands and kinds of flour and yeast further complicated the control system."
It is said that in order to solve the dough-kneading problem, Ikuko Tanaka was sent to learn how to make bread with a famous master baker. After a period she noticed that the baker was not only stretching but also twisting the dough, which turned out to be the secret of making tasty bread.
At this point, Nonaka and Takeuchi's argument is that tacit knowledge can be made explicit by capturing it in the form of metaphors, analogies, concepts, hypotheses or models, which designers can then incorporate into machines. Kneading dough is presented as the central example. Tanaka was able to transfer her knowledge to the engineers by using the phrase "twisting stretch" to provide a rough image of working the dough. Her request for a "twisting stretch" movement was interpreted by the engineers and, after a year of trial and error, the team came up with product specifications that successfully reproduced the head baker's stretching technique. The team then materialised this concept, putting it together into a manual and embodying it in the product.
The temperature problem was solved by adding the yeast at a later stage in the process. This was the way people had made bread in the past, and this method was "the result of the socialisation and externalisation of the team members' tacit knowledge". Here, Nonaka and Takeuchi seem to use the term tacit knowledge to refer to knowledge which is easily verbalised but which nobody has thought to mention.
Other case study examples of company-specific experiential learning are provided in "Corporate DNA: Using Organizational Memory to Improve Poor Decision-Making", Arnold Kransdorff, Gower Publishing Ltd., 2006.

Wednesday, November 6, 2019

My interpretations of the Wizard of Oz Essays

The Wizard of Oz is one of the most popular works of children's literature in the world. I have read various interpretive theories on the internet: for example, Henry Littlefield's "Parable on Populism", Oscar Gandy's analogy between the Yellow Brick Road and the Information Superhighway, Paul Nathanson's "The Wizard of Oz as a Secular Myth", and Salman Rushdie's theories of Oz. There are actually huge differences among these scholars' ideas, and most of them go far beyond the ostensible story of The Wizard of Oz itself. For example, Rushdie rejects the conventional view that its fantasy of escape from reality ends with a comforting return to "home, sweet home". On the contrary, Rushdie thinks it is a film which speaks to the exile: The Wizard of Oz shows that imagination can become reality, that there is no such place as home, or rather that the only home is the one we make for ourselves. As a grown-up who is both a non-native speaker of English and from a non-Western culture, I hold a different view of the story. My first impression of The Wizard of Oz goes back to when I was a ten-year-old elementary school kid. The most attractive part was the interesting characters, the Scarecrow, the Tin Man and the Cowardly Lion, though not Dorothy, the little girl. In my view, the story itself is clearly about finding ourselves. All of the characters are looking for something they think they don't have. But is that true? As a Taiwanese, I have been influenced by American culture little by little since I was a kid. We watch American TV programmes and American movies, and absorb American values, all the time. However, the most unforgettable culture shock for me is the idea of "searching for our identity". I still remember certain typical characters in American soap operas.
They sometimes do nothing but think, wear different clothes or fool around. Those characters mostly seem to be young men, and they always answer "I am looking for myself" when people ask them what they are doing. Comparing Western culture with Chinese culture, our education doesn't encourage young people to think independently. We tend to follow our seniors' thoughts. In other words, we seldom think about what we lack in ourselves, let alone about our own identity. From my personal observation, many US or European students like to spend some time, perhaps even years, travelling or working after compulsory education. They stop moving on to the next step in their lives for a while. After getting to know themselves better, they devote themselves to further education or work again. People who grow up in Chinese culture, obviously, don't seem to do so. Before getting to know ourselves better, we tend to rush to the next step, and the worse thing is that we follow other people's views when we face a choice. Other elements, such as the family values in the story, also stand for typical American values. Dorothy is a little girl full of curiosity, eager to find a way to solve her problem in reality, which could represent the American character in some respects. The three main characters she meets along the way also serve as symbols of finding one's identity. Through the characters' consistent will to find a way home, the story highlights the family values of American culture. On the other hand, it took me a decade to understand that the Tin Man used to be a real man, and that he became a tin man because of an accident. I used to think that the Tin Man was a robot. I even asked my classmates for their views on the Tin Man in class while our teacher, River, was teaching the literature.
To my surprise, my classmates held the same view as I had before reading the novel. What a coincidence. After all, as far as I am concerned, reading children's literature has been a totally fresh experience, because what I've learned from the story is not only the differences in values between Western culture and Chinese culture but also the similarities, which might point to a universal tendency to search for the goal of life. Maybe we all lack something; that's why we are here, experiencing this life.

Sunday, November 3, 2019

Introduction to Communication-Improve Your Listening Essay

Introduction to Communication-Improve Your Listening - Essay Example my personal concerns is however a strategy to overcoming the barrier because shifting focus from self is likely to reduce effects of my concerns on my concentration through establishing psychological stability. An effort to concentrating on a message in a communication is another strategy to overcoming the barrier and is likely increase amount of information that I capture (Beebe, Steven & Beebe Susan, 2011). Poor attitude is another barrier to my listening. I am often critical during communication and am quick to identify possible mistakes in a speaker’s presentations. I am equally judgmental over a speaker’s physical appearance and negative attitude shifts my attention from a speaker and associated message. Focusing on message content, rather than its structure and the speaker, is the possible strategy to overcoming the barrier because it can reduce my criticism and improve my ability capture information in communication through focusing on message details (Beebe, Steven & Beebe Susan,

Friday, November 1, 2019

Current Events in Macroeconomics Essay Example | Topics and Well Written Essays - 750 words

They are especially critical of many levels of government intervention, including the proposed guest worker program. The two concur in many areas, including the fact that despite the illegality of this immigration, it benefits the economic status we enjoy, because illegal immigrants take positions that others would not take in service industries such as housekeeping and landscaping. The two also state that there are many pluses, including the fact that taxes are often paid by these illegal immigrants. The negatives, unfortunately, are also easily seen, including the use of public education, fire and police protection, and all forms of government assistance, including Medicaid, Food Stamps and other such programs. The overall concept of this article encompasses human factors: how humanity affects the economy, and in this particular case, how the migration and immigration of people affect a particular economy in both good and bad ways. This is subject to a great deal of debate as to whether such immigration would be of benefit and whether attempts to change the policy would have any economic merit. In the Economy section, David Wessel writes in "A Tricky Move for the Fed" (Wall Street Journal, June 24, 2006) that the decision to move the interest rate up or down is a tricky proposition in current economic times. The reason is that finding an appropriate level of balance is as tricky as regaining it once balance has been lost. With inflation on the rise and the economy slowing, it is a question of which fire is the most urgent to fight. Raising rates could slow the economy and cause a recession, something they don't want; lowering them too much might cause inflation to rise exponentially, which is also not favorable in the eyes of economists.
Consideration in either case raises concern over employment; the consumer factor is also at issue, and we must also consider how these interact within the economy. Mark Whitehouse reveals, in a piece in the June 26, 2006 Wall Street Journal titled "A Housing Slowdown Can Put the Brakes on a Job Sector but Open Other Opportunities", that the housing boom our country has been enjoying may be coming to an end, and that this end has the appearance of a bad thing. In actuality, what lies within this potentially devastating slowdown could bring about benefits at first unseen. "From a macroeconomic perspective, the housing slowdown, and the attendant slowing of job growth, could be just what the economy needs."1 In essence, the fact that houses are not selling as swiftly as they can be built, or as swiftly as someone needs to be transferred, may actually prove beneficial in other areas of the economic landscape. This change can further cause a rise in movement elsewhere within the macroeconomic landscape, continuing the cycle that will bring economic relief. Isabelle Lindenmayer writes about the state of the US dollar in her article "Dollar Is to Enter an Unsteady Week Ahead of Fed Move". As in most cases, the dollar will have its good days and bad days on the market in regard to value, and on occasion the Fed can influence those good and bad days. In an analysis of expert opinions, two factors,