author    Diego Novillo <dnovillo@redhat.com>  2006-10-11 15:35:49 +0000
committer Diego Novillo <dnovillo@redhat.com>  2006-10-11 15:35:49 +0000
commit    580e977637f988481b9144898136db4620bc7526 (patch)
tree      52adb0b610c3b712c69aa0b38fb3bc625c22048d
parent    8a7855eb5097d1067df65754cfdc7f85dd583a36 (diff)
NOTE: WIP. With these changes, the branch does not bootstrap.
git-svn-id: https://gcc.gnu.org/svn/gcc/branches/mem-ssa@117636 138bc75d-0d04-0410-961f-82ee72b054a4
-rw-r--r--  gcc/ChangeLog.mem-ssa | 165
-rw-r--r--  gcc/bitmap.c | 32
-rw-r--r--  gcc/bitmap.h | 3
-rw-r--r--  gcc/lambda-code.c | 8
-rw-r--r--  gcc/timevar.def | 2
-rw-r--r--  gcc/tree-cfg.c | 39
-rw-r--r--  gcc/tree-dfa.c | 15
-rw-r--r--  gcc/tree-dump.c | 1
-rw-r--r--  gcc/tree-flow-inline.h | 7
-rw-r--r--  gcc/tree-flow.h | 7
-rw-r--r--  gcc/tree-gimple.c | 2
-rw-r--r--  gcc/tree-into-ssa.c | 2021
-rw-r--r--  gcc/tree-nrv.c | 17
-rw-r--r--  gcc/tree-outof-ssa.c | 6
-rw-r--r--  gcc/tree-pass.h | 2
-rw-r--r--  gcc/tree-phinodes.c | 59
-rw-r--r--  gcc/tree-pretty-print.c | 60
-rw-r--r--  gcc/tree-scalar-evolution.c | 12
-rw-r--r--  gcc/tree-ssa-alias.c | 32
-rw-r--r--  gcc/tree-ssa-dce.c | 2
-rw-r--r--  gcc/tree-ssa-dom.c | 2
-rw-r--r--  gcc/tree-ssa-loop-im.c | 44
-rw-r--r--  gcc/tree-ssa-loop-ivopts.c | 7
-rw-r--r--  gcc/tree-ssa-loop-manip.c | 4
-rw-r--r--  gcc/tree-ssa-loop-prefetch.c | 3
-rw-r--r--  gcc/tree-ssa-operands.c | 537
-rw-r--r--  gcc/tree-ssa-operands.h | 20
-rw-r--r--  gcc/tree-ssa-pre.c | 2
-rw-r--r--  gcc/tree-ssa-structalias.c | 17
-rw-r--r--  gcc/tree-ssa.c | 116
-rw-r--r--  gcc/tree-ssanames.c | 1
-rw-r--r--  gcc/tree-stdarg.c | 29
-rw-r--r--  gcc/tree-vect-transform.c | 19
-rw-r--r--  gcc/tree-vectorizer.c | 112
-rw-r--r--  gcc/tree-vectorizer.h | 2
-rw-r--r--  gcc/tree.h | 2
36 files changed, 2381 insertions, 1028 deletions
diff --git a/gcc/ChangeLog.mem-ssa b/gcc/ChangeLog.mem-ssa
index 20c9f418fe6..17c822d39a3 100644
--- a/gcc/ChangeLog.mem-ssa
+++ b/gcc/ChangeLog.mem-ssa
@@ -1,3 +1,168 @@
+2006-10-11 Diego Novillo <dnovillo@redhat.com>
+
+ NOTE: WIP. With these changes, the branch does not bootstrap.
+
+ * tree-ssa-operands.h (dump_loads_and_stores,
+ debug_loads_and_stores, dump_decl_set, debug_decl_set):
+ Declare.
+ (struct mem_syms_map_d): Declare.
+ (mem_syms_map_t): Declare.
+ (add_loads_and_stores, move_loads_and_stores,
+ delete_loads_and_stores): Declare.
+ * tree-into-ssa.c (struct ssa_name_info): Add field
+ 'reached_syms'.
+ (syms_stored_in_bb): New local variable.
+ (syms_with_phi_in_bb): New local variable.
+ (syms_to_factor_in_bb): New local variable.
+ (struct unfactored_phis_d): Declare.
+ (unfactored_phis_t): Declare.
+ (first_unfactored_phi): New local variable.
+ (last_unfactored_phi): New local variable.
+ (unfactored_phis): New local variable.
+ (last_dom_num): New local variable.
+ (mem_syms_tbl): Remove.
+ (dump_stored_syms): New.
+ (debug_stored_syms): New.
+ (dump_unfactored_phis): New.
+ (debug_unfactored_phis): New.
+ (dump_unfactored_phi): New.
+ (debug_unfactored_phi): New.
+ (get_dom_num): New.
+ (get_name_dom_num): New.
+ (set_next_dom_num): New.
+ (unfactored_phis_hash): New.
+ (unfactored_phis_eq): New.
+ (unfactored_phis): New.
+ (mem_syms_hash): Remove.
+ (mem_syms_eq): Remove.
+ (mem_syms_free): Remove.
+ (add_syms_with_phi): New.
+ (add_stored_syms): New.
+ (add_stored_sym): New.
+ (mark_def_sites): Do not process statements that reference
+ memory.
+ (compute_idf): Rename from find_idf.
+ When inserting PHI nodes for .MEM, determine set of symbols to
+ associate initially using SYMS_STORED_IN_BB and
+ SYMS_WITH_PHI_IN_BB.
+ (insert_phi_nodes_for): Call create_factored_phi_node if
+ needed.
+ (insert_phi_nodes): Likewise.
+ (rewrite_memory_stmt): Remove.
+ (rewrite_update_init_block): Call set_next_dom_num.
+ PHI nodes for .MEM become the current definition for every
+ symbol factored in them.
+ (preserve_needed_names_in_vops): Remove.
+ (rewrite_vops): New.
+ (rewrite_update_stmt_vops): Call it.
+ (rewrite_update_stmt): Call set_next_dom_num.
+ (add_reached_sym): New.
+ (lookup_unfactored_phi): New.
+ (get_unfactored_phi): New.
+ (split_factored_phi): New.
+ (replace_factored_phi_argument): New.
+ (rewrite_update_phi_arguments): Call it.
+ (init_ssa_renamer): Initialize last_dom_num,
+ syms_stored_in_bb, syms_with_phi_in_bb, syms_to_factor_in_bb.
+ (fini_ssa_renamer): Deallocate syms_stored_in_bb,
+ syms_with_phi_in_bb, syms_to_factor_in_bb.
+ (prepare_block_for_update): Fill in syms_with_phi_in_bb and
+ syms_stored_in_bb.
+ (delete_update_ssa): Deallocate unfactored_phis and
+ blocks_with_phis_to_rewrite.
+ (add_to_fixup_queues): New.
+ (compute_currdefs_for): New.
+ (fixup_unfactored_phis): New.
+ (update_ssa): Insert PHI nodes for .MEM separately.
+ Do not try to insert PHI nodes for memory symbols.
+ Call fixup_unfactored_phis.
+ * tree-dump.c (dump_options): Add entry for "memsyms".
+ * tree-pretty-print.c (debug_generic_expr): Add flag
+ TDF_MEMSYMS.
+ (debug_tree_chain): Likewise.
+ (dump_symbols): New.
+ (dump_generic_node): Call it if TDF_MEMSYMS is given.
+ (dump_vops): Likewise.
+ * tree-ssa-loop-manip.c (split_loop_exit_edge): Ignore memory
+ symbols.
+ * tree-pass.h (TDF_MEMSYMS): Define.
+ * bitmap.c (bitmap_singleton_p): New.
+ * bitmap.h (bitmap_singleton_p): Declare.
+ * tree-phinodes.c (release_phi_node): Call
+ delete_loads_and_stores.
+ (resize_phi_node): Call move_loads_and_stores.
+ (create_phi_node_1): New.
+ (create_phi_node): Call it.
+ (create_factored_phi_node): New.
+ (remove_phi_node): Add new boolean argument RELEASE_LHS_P. If
+ it is true, call release_ssa_name.
+ Update all callers.
+ * tree-ssa-alias.c (init_alias_info): The first time aliases
+ are computed mark every memory symbol to be renamed.
+ (setup_pointers_and_addressables):
+ (dump_points_to_info_for):
+ * timevar.def (TV_TREE_SSA_PHI_UNFACTOR): New.
+ (TV_TREE_SSA_FIX_UNFACTORED_UD): New.
+ * tree-vectorizer.c (slpeel_update_phi_nodes_for_guard1): Only
+ handle LCSSA PHIs for GIMPLE registers.
+ * tree-vectorizer.h (vect_memsyms_to_rename): Rename from
+ vect_vnames_to_rename. Update all users.
+ * tree-flow-inline.h (zero_imm_uses_p): New.
+ * tree-ssa.c (verify_ssa_name): Do not verify sub-variables
+ for memory symbols.
+ Verify that every memory name uses the canonical MEM_0
+ definition.
+ (verify_use): Add argument SYMS_IN_FACTORED_PHIS. Update all
+ users.
+ Add verification for PHI nodes that reference memory.
+ (stmt_references_memory_p): Move to tree-ssa-operands.c
+ (walk_use_def_chains_1): Guard against NULL PHI arguments.
+ * tree-ssa-loop-prefetch.c (SSA_NAME_VAR): Remove.
+ * tree-flow.h (struct stmt_ann_d): Add bitfield
+ references_memory.
+ (create_factored_phi_node): Declare.
+ * tree-cfg.c
+ (bsi_remove): Call delete_loads_and_stores if needed.
+ (bsi_replace): Likewise.
+ (tree_make_forwarder_block): Call create_factored_phi_node if
+ needed.
+ (tree_duplicate_bb): Likewise.
+ * tree-ssanames.c (release_defs): Remove assertion against
+ .MEM.
+ * tree-ssa-operands.c (mem_syms_tbl): New local variable.
+ (mem_syms_hash): New.
+ (mem_syms_eq): New.
+ (mem_syms_free): New.
+ (init_ssa_operands): Initialize mem_syms_tbl.
+ (fini_ssa_operands): Delete mem_syms_tbl.
+ (add_virtual_operator): Do nothing if aliases_computed_p is
+ false.
+ (add_mem_symbol): Do nothing if stored_syms is not set.
+ (get_mem_symbols_in_indirect_ref): Process statement even if
+ aliases have not been computed.
+ (add_stmt_operand): Mark statement as referencing memory if
+ needed.
+ (get_indirect_ref_operands): Likewise.
+ (get_tmr_operands): Likewise.
+ (get_call_expr_operands): Likewise.
+ (get_asm_expr_operands): Likewise.
+ (build_ssa_operands): Initialize references_memory annotation
+ to false.
+ If has_volatile_ops has been marked by the parser, also set
+ references_memory.
+ (get_loads_and_stores): Change interface to return sets of
+ loads and stores in a structure. Update all callers.
+ (delete_loads_and_stores): New.
+ (add_loads_and_stores): New.
+ (move_loads_and_stores): New.
+ (update_loads_and_stores): New.
+ (update_stmt_operands): Call update_loads_and_stores if
+ needed.
+ (mark_difference_for_renaming): Compute difference without
+ clobbering the input sets.
+ (stmt_references_memory_p): Move from tree-ssa.c. Check the
+ statement annotation.
+
2006-07-07 Diego Novillo <dnovillo@redhat.com>
Mainline merge as of 2006-07-06 (@115232).
diff --git a/gcc/bitmap.c b/gcc/bitmap.c
index efc789a3211..a054450111f 100644
--- a/gcc/bitmap.c
+++ b/gcc/bitmap.c
@@ -579,6 +579,38 @@ bitmap_count_bits (bitmap a)
}
+/* Return true if bitmap A is a singleton set (i.e., it contains only
+ one element). */
+
+bool
+bitmap_singleton_p (bitmap a)
+{
+ bitmap_element *elt;
+ unsigned long count = 0;
+ unsigned ix;
+
+ if (bitmap_empty_p (a))
+ return false;
+
+ /* If there are more than two entries in the list, the set cannot be
+ a singleton. */
+ if (a->first->next)
+ return false;
+
+ for (count = 0, ix = 0; ix != BITMAP_ELEMENT_WORDS; ix++)
+ {
+#if GCC_VERSION >= 3400
+ count += __builtin_popcountl (a->first->bits[ix]);
+#else
+ count += bitmap_popcount (a->first->bits[ix]);
+#endif
+ if (count > 1)
+ return false;
+ }
+
+ return true;
+}
+
/* Return the bit number of the first set bit in the bitmap. The
bitmap must be non-empty. */
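
(Aside, not part of the patch: a minimal usage sketch of the new predicate against the existing GCC bitmap API. SYM, use_single_symbol and use_factored_name are hypothetical placeholders; the point is that bitmap_singleton_p can stop as soon as a second set bit is found, while bitmap_count_bits always scans the whole element.)

/* Illustrative caller only.  */
bitmap syms = BITMAP_ALLOC (NULL);
bitmap_set_bit (syms, DECL_UID (sym));

if (bitmap_singleton_p (syms))
  /* Exactly one symbol; no factoring is needed.  */
  use_single_symbol (referenced_var_lookup (bitmap_first_set_bit (syms)));
else
  /* Empty or several symbols.  */
  use_factored_name (syms);

BITMAP_FREE (syms);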
diff --git a/gcc/bitmap.h b/gcc/bitmap.h
index d11fa46243b..eca956347a5 100644
--- a/gcc/bitmap.h
+++ b/gcc/bitmap.h
@@ -103,6 +103,9 @@ extern bool bitmap_intersect_compl_p (bitmap, bitmap);
/* Count the number of bits set in the bitmap. */
extern unsigned long bitmap_count_bits (bitmap);
+/* Determine whether a bitmap is a singleton set. */
+extern bool bitmap_singleton_p (bitmap);
+
/* Boolean operations on bitmaps. The _into variants are two operand
versions that modify the first source operand. The other variants
are three operand versions that do not destroy the source bitmaps.
diff --git a/gcc/lambda-code.c b/gcc/lambda-code.c
index 2a03fd31cd5..02cf755665d 100644
--- a/gcc/lambda-code.c
+++ b/gcc/lambda-code.c
@@ -2403,13 +2403,9 @@ perfect_nestify (struct loops *loops,
}
e = redirect_edge_and_branch (single_succ_edge (preheaderbb), headerbb);
- /* Remove the exit phis from the old basic block. Make sure to set
- PHI_RESULT to null so it doesn't get released. */
+ /* Remove the exit phis from the old basic block. */
while (phi_nodes (olddest) != NULL)
- {
- SET_PHI_RESULT (phi_nodes (olddest), NULL);
- remove_phi_node (phi_nodes (olddest), NULL);
- }
+ remove_phi_node (phi_nodes (olddest), NULL, false);
/* and add them back to the new basic block. */
while (VEC_length (tree, phis) != 0)
diff --git a/gcc/timevar.def b/gcc/timevar.def
index 28b0b766eba..f0a2d7aa23e 100644
--- a/gcc/timevar.def
+++ b/gcc/timevar.def
@@ -81,6 +81,8 @@ DEFTIMEVAR (TV_TREE_INSERT_PHI_NODES , "tree PHI insertion")
DEFTIMEVAR (TV_TREE_SSA_REWRITE_BLOCKS, "tree SSA rewrite")
DEFTIMEVAR (TV_TREE_SSA_OTHER , "tree SSA other")
DEFTIMEVAR (TV_TREE_SSA_INCREMENTAL , "tree SSA incremental")
+DEFTIMEVAR (TV_TREE_SSA_PHI_UNFACTOR , "tree SSA PHI unfactoring")
+DEFTIMEVAR (TV_TREE_SSA_FIX_UNFACTORED_UD , "tree SSA fixup unfactored PHIs")
DEFTIMEVAR (TV_TREE_OPS , "tree operand scan")
DEFTIMEVAR (TV_TREE_SSA_DOMINATOR_OPTS , "dominator optimization")
DEFTIMEVAR (TV_TREE_SRA , "tree SRA")
diff --git a/gcc/tree-cfg.c b/gcc/tree-cfg.c
index 298a8f57d95..b33cbb4bbad 100644
--- a/gcc/tree-cfg.c
+++ b/gcc/tree-cfg.c
@@ -1301,7 +1301,7 @@ replace_uses_by (tree name, tree val)
}
}
- gcc_assert (num_imm_uses (name) == 0);
+ gcc_assert (zero_imm_uses_p (name));
/* Also update the trees stored in loop structures. */
if (current_loops)
@@ -1356,13 +1356,12 @@ tree_merge_blocks (basic_block a, basic_block b)
appear as arguments of the phi nodes. */
copy = build2 (MODIFY_EXPR, void_type_node, def, use);
bsi_insert_after (&bsi, copy, BSI_NEW_STMT);
- SET_PHI_RESULT (phi, NULL_TREE);
SSA_NAME_DEF_STMT (def) = copy;
}
else
replace_uses_by (def, use);
- remove_phi_node (phi, NULL);
+ remove_phi_node (phi, NULL, false);
}
/* Ensure that B follows A. */
@@ -1990,7 +1989,7 @@ remove_phi_nodes_and_edges_for_unreachable_block (basic_block bb)
while (phi)
{
tree next = PHI_CHAIN (phi);
- remove_phi_node (phi, NULL_TREE);
+ remove_phi_node (phi, NULL_TREE, true);
phi = next;
}
@@ -2253,7 +2252,7 @@ find_case_label_for_value (tree switch_expr, tree val)
void
tree_dump_bb (basic_block bb, FILE *outf, int indent)
{
- dump_generic_bb (outf, bb, indent, TDF_VOPS);
+ dump_generic_bb (outf, bb, indent, TDF_VOPS|TDF_MEMSYMS);
}
@@ -2863,6 +2862,9 @@ bsi_remove (block_stmt_iterator *i, bool remove_eh_info)
mark_stmt_modified (t);
if (remove_eh_info)
remove_stmt_from_eh_region (t);
+
+ if (stmt_references_memory_p (t))
+ delete_loads_and_stores (t);
}
@@ -2929,6 +2931,8 @@ bsi_replace (const block_stmt_iterator *bsi, tree stmt, bool update_eh_info)
}
delink_stmt_imm_use (orig_stmt);
+ if (stmt_references_memory_p (orig_stmt))
+ delete_loads_and_stores (orig_stmt);
*bsi_stmt_ptr (*bsi) = stmt;
mark_stmt_modified (stmt);
update_modified_stmts (stmt);
@@ -3945,12 +3949,20 @@ tree_make_forwarder_block (edge fallthru)
if (single_pred_p (bb))
return;
- /* If we redirected a branch we must create new phi nodes at the
+ /* If we redirected a branch we must create new PHI nodes at the
start of BB. */
for (phi = phi_nodes (dummy); phi; phi = PHI_CHAIN (phi))
{
var = PHI_RESULT (phi);
- new_phi = create_phi_node (var, bb);
+
+ if (SSA_NAME_VAR (var) == mem_var)
+ {
+ bitmap syms = get_loads_and_stores (phi)->stores;
+ new_phi = create_factored_phi_node (var, bb, syms);
+ }
+ else
+ new_phi = create_phi_node (var, bb);
+
SSA_NAME_DEF_STMT (var) = new_phi;
SET_PHI_RESULT (phi, make_ssa_name (SSA_NAME_VAR (var), phi));
add_phi_arg (new_phi, PHI_RESULT (phi), fallthru);
@@ -4251,7 +4263,18 @@ tree_duplicate_bb (basic_block bb)
the incoming edges have not been setup yet. */
for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
{
- tree copy = create_phi_node (PHI_RESULT (phi), new_bb);
+ tree lhs = PHI_RESULT (phi);
+ tree lhs_sym = SSA_NAME_VAR (lhs);
+ tree copy;
+
+ if (lhs_sym == mem_var)
+ {
+ bitmap syms = get_loads_and_stores (phi)->stores;
+ copy = create_factored_phi_node (PHI_RESULT (phi), new_bb, syms);
+ }
+ else
+ copy = create_phi_node (PHI_RESULT (phi), new_bb);
+
create_new_def_for (PHI_RESULT (copy), copy, PHI_RESULT_PTR (copy));
}
diff --git a/gcc/tree-dfa.c b/gcc/tree-dfa.c
index e746da771d9..0bca96ffa8a 100644
--- a/gcc/tree-dfa.c
+++ b/gcc/tree-dfa.c
@@ -768,18 +768,9 @@ mark_symbols_for_renaming (tree stmt)
if (stmt_references_memory_p (stmt))
{
- bitmap loads, stores;
-
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
-
- get_loads_and_stores (stmt, loads, stores);
-
- mark_set_for_renaming (loads);
- mark_set_for_renaming (stores);
-
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+ mem_syms_map_t mp = get_loads_and_stores (stmt);
+ mark_set_for_renaming (mp->loads);
+ mark_set_for_renaming (mp->stores);
}
/* Mark all the GIMPLE register operands for renaming. */
diff --git a/gcc/tree-dump.c b/gcc/tree-dump.c
index d97dc83af13..fc24604135b 100644
--- a/gcc/tree-dump.c
+++ b/gcc/tree-dump.c
@@ -784,6 +784,7 @@ static const struct dump_option_value_info dump_options[] =
{"lineno", TDF_LINENO},
{"uid", TDF_UID},
{"stmtaddr", TDF_STMTADDR},
+ {"memsyms", TDF_MEMSYMS},
{"all", ~(TDF_RAW | TDF_SLIM | TDF_LINENO | TDF_TREE | TDF_RTL | TDF_IPA
| TDF_STMTADDR | TDF_GRAPH)},
{NULL, 0}
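
(Aside, not part of the patch: the new "memsyms" modifier plugs into the existing dump-option machinery, so a pass can gate extra symbol-set output on TDF_MEMSYMS in its dump code. STORES below is a hypothetical bitmap of stored symbols; dump_decl_set is the dumper declared elsewhere in this patch.)

/* Illustrative only.  */
if (dump_file && (dump_flags & TDF_MEMSYMS))
  {
    fprintf (dump_file, "stored symbols: ");
    dump_decl_set (dump_file, stores);
  }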
diff --git a/gcc/tree-flow-inline.h b/gcc/tree-flow-inline.h
index c1db6c55151..6056dea5f62 100644
--- a/gcc/tree-flow-inline.h
+++ b/gcc/tree-flow-inline.h
@@ -493,6 +493,13 @@ num_imm_uses (tree var)
return num;
}
+/* Return true if VAR has no immediate uses. */
+static inline bool
+zero_imm_uses_p (tree var)
+{
+ ssa_use_operand_t *ptr = &(SSA_NAME_IMM_USE_NODE (var));
+ return (ptr == ptr->next);
+}
/* Return the tree pointer to by USE. */
static inline tree
diff --git a/gcc/tree-flow.h b/gcc/tree-flow.h
index 8392c82f70e..6dbc4a6c11b 100644
--- a/gcc/tree-flow.h
+++ b/gcc/tree-flow.h
@@ -294,6 +294,10 @@ struct stmt_ann_d GTY(())
and local addressable variables. */
unsigned makes_clobbering_call : 1;
+ /* Nonzero if the statement references memory (at least one of its
+ expressions contains a non-register operand). */
+ unsigned references_memory : 1;
+
/* Basic block that contains this statement. */
basic_block bb;
@@ -647,9 +651,10 @@ extern tree default_def_fn (struct function *, tree);
/* In tree-phinodes.c */
extern void reserve_phi_args_for_new_edge (basic_block);
extern tree create_phi_node (tree, basic_block);
+extern tree create_factored_phi_node (tree, basic_block, bitmap);
extern void add_phi_arg (tree, tree, edge);
extern void remove_phi_args (edge);
-extern void remove_phi_node (tree, tree);
+extern void remove_phi_node (tree, tree, bool);
extern tree phi_reverse (tree);
/* In gimple-low.c */
diff --git a/gcc/tree-gimple.c b/gcc/tree-gimple.c
index cf0d92b390f..86d09b00a29 100644
--- a/gcc/tree-gimple.c
+++ b/gcc/tree-gimple.c
@@ -374,7 +374,7 @@ is_gimple_val (tree t)
/* FIXME make these decls. That can happen only when we expose the
entire landing-pad construct at the tree level. */
if (TREE_CODE (t) == EXC_PTR_EXPR || TREE_CODE (t) == FILTER_EXPR)
- return 1;
+ return true;
return (is_gimple_variable (t) || is_gimple_min_invariant (t));
}
diff --git a/gcc/tree-into-ssa.c b/gcc/tree-into-ssa.c
index b73622fcec7..ff4e08caac7 100644
--- a/gcc/tree-into-ssa.c
+++ b/gcc/tree-into-ssa.c
@@ -201,7 +201,7 @@ struct mark_def_sites_global_data
/* Information stored for SSA names. */
struct ssa_name_info
{
- /* The actual definition of the ssa name. */
+ /* The current reaching definition replacing this SSA name. */
tree current_def;
/* This field indicates whether or not the variable may need PHI nodes.
@@ -213,6 +213,13 @@ struct ssa_name_info
quickly); if AGE < CURRENT_INFO_FOR_SSA_NAME_AGE, then the fields
are assumed to be null. */
unsigned age;
+
+ /* For .MEM names, this is the set of symbols that are currently
+ reached by this name. This is used when rewriting the arguments
+ of factored PHI nodes in replace_factored_phi_argument. Do not
+ try to use it outside that function, as its contents are only
+ valid within that context. */
+ bitmap reached_syms;
};
/* The information associated with names. */
@@ -259,19 +266,61 @@ enum rewrite_mode {
DEF_VEC_P(bitmap);
DEF_VEC_ALLOC_P(bitmap,heap);
-/* Mapping between a statement and the symbols referenced by it. */
-struct mem_syms_map_d
+/* Array of sets of memory symbols modified by memory store operations
+ in each basic block. */
+static bitmap *syms_stored_in_bb;
+
+/* Array of sets of memory symbols that already contain a PHI node in
+ each basic block. */
+static bitmap *syms_with_phi_in_bb;
+
+/* Array of sets of memory symbols that need a factored PHI node in
+ each basic block. When inserting new factored PHI nodes, these
+ sets are the initial sets of symbols factored by those PHI nodes. */
+static bitmap *syms_to_factor_in_bb;
+
+/* When a factored PHI node P has arguments with multiple reaching
+ definitions it needs to be split into multiple PHI nodes to hold
+ the different reaching definitions. The problem is that the
+ sub-tree dominated by the block holding P may have already been
+ renamed. Some statements that are reached by P should really be
+ reached by one of the new PHI nodes split from P.
+
+ This problem would not exist if we could guarantee that PHI nodes
+ get their arguments filled in before their dominated sub-tree is
+ renamed. However, due to circular references created by loops, it
+ is generally not possible to guarantee this ordering.
+
+ We solve this problem by post-processing PHI nodes that have been
+ split. For every split PHI node P, we keep a list of PHI nodes
+ split from P. We then traverse the list of immediate uses for P
+ and determine whether they should be reached by one of P's children
+ instead. */
+struct unfactored_phis_d
{
- tree stmt;
- bitmap loaded;
- bitmap stored;
+ /* The PHI node that has been split. */
+ tree phi;
+
+ /* List of PHI nodes created to disambiguate arguments with multiple
+ reaching definitions. */
+ VEC(tree, heap) *children;
+
+ /* Next PHI in the list. */
+ struct unfactored_phis_d *next;
};
-typedef struct mem_syms_map_d *mem_syms_map_t;
+typedef struct unfactored_phis_d *unfactored_phis_t;
+
+static unfactored_phis_t first_unfactored_phi = NULL;
+static unfactored_phis_t last_unfactored_phi = NULL;
+static htab_t unfactored_phis = NULL;
+
+/* Last dominance number assigned to an SSA name. Dominance
+ numbers are used to order reaching definitions when fixing UD
+ chains for statements reached by split PHI nodes (see
+ fixup_unfactored_phis). */
+static unsigned int last_dom_num;
-/* Table, indexed by statement, for symbols loaded and stored by
- statements that reference memory. */
-static htab_t mem_syms_tbl;
/* Prototypes for debugging functions. */
extern void dump_tree_ssa (FILE *);
@@ -289,10 +338,16 @@ extern void dump_defs_stack (FILE *, int);
extern void debug_defs_stack (int);
extern void dump_currdefs (FILE *);
extern void debug_currdefs (void);
+extern void dump_stored_syms (FILE *);
+extern void debug_stored_syms (void);
+extern void dump_unfactored_phis (FILE *);
+extern void debug_unfactored_phis (void);
+extern void dump_unfactored_phi (FILE *, tree);
+extern void debug_unfactored_phi (tree);
/* Get the information associated with NAME. */
-static inline struct ssa_name_info *
+static inline ssa_name_info_p
get_ssa_name_ann (tree name)
{
unsigned ver = SSA_NAME_VERSION (name);
@@ -318,12 +373,14 @@ get_ssa_name_ann (tree name)
info->need_phi_state = 0;
info->current_def = NULL_TREE;
info->age = current_info_for_ssa_name_age;
+ info->reached_syms = NULL;
}
return info;
}
-/* Clears info for ssa names. */
+
+/* Clears info for SSA names. */
static void
clear_ssa_name_info (void)
@@ -331,7 +388,44 @@ clear_ssa_name_info (void)
current_info_for_ssa_name_age++;
}
-/* Gets phi_state field for VAR. */
+
+/* Return the dominance number associated with STMT. Dominance numbers
+ are computed during renaming. Given two statements S1 and S2, it
+ is guaranteed that if DOM_NUM (S2) > DOM_NUM (S1) then either S2
+ post-dominates S1 or S1 and S2 are on unrelated dominance
+ sub-trees. This property is used when post-processing split PHI
+ nodes after renaming (see fixup_unfactored_phis). */
+
+static unsigned int
+get_dom_num (tree stmt)
+{
+ return get_stmt_ann (stmt)->uid;
+}
+
+/* Likewise, but for SSA name NAME. */
+
+static unsigned int
+get_name_dom_num (tree name)
+{
+ tree def_stmt = SSA_NAME_DEF_STMT (name);
+
+ if (IS_EMPTY_STMT (def_stmt))
+ return 1;
+
+ return get_dom_num (def_stmt);
+}
+
+
+/* Assign the next dominance number to STMT. */
+
+static inline void
+set_next_dom_num (tree stmt)
+{
+ get_stmt_ann (stmt)->uid = last_dom_num++;
+}
+
+
+/* Get phi_state field for VAR. */
static inline enum need_phi_state
get_phi_state (tree var)
@@ -398,9 +492,7 @@ compute_global_livein (bitmap livein, bitmap def_blocks)
= (basic_block *) xmalloc (sizeof (basic_block) * (last_basic_block + 1));
EXECUTE_IF_SET_IN_BITMAP (livein, 0, i, bi)
- {
- *tos++ = BASIC_BLOCK (i);
- }
+ *tos++ = BASIC_BLOCK (i);
/* Iterate until the worklist is empty. */
while (tos != worklist)
@@ -617,6 +709,30 @@ repl_map_free (void *p)
}
+/* Hashing and equality functions for UNFACTORED_PHIS. */
+
+static hashval_t
+unfactored_phis_hash (const void *p)
+{
+ return htab_hash_pointer ((const void *)
+ ((const struct unfactored_phis_d *)p)->phi);
+}
+
+static int
+unfactored_phis_eq (const void *p1, const void *p2)
+{
+ return ((const struct unfactored_phis_d *)p1)->phi
+ == ((const struct unfactored_phis_d *)p2)->phi;
+}
+
+static void
+unfactored_phis_free (void *p)
+{
+ VEC_free (tree, heap, ((struct unfactored_phis_d *)p)->children);
+ free (p);
+}
+
+
/* Return the names replaced by NEW (i.e., REPL_TBL[NEW].SET). */
static inline bitmap
@@ -731,60 +847,65 @@ add_new_name_mapping (tree new, tree old)
}
-/* Hashing and equality functions for MEM_SYMS_TBL. */
+/* Add SYMS to the set of symbols with existing PHI nodes in basic
+ block TO. */
-static hashval_t
-mem_syms_hash (const void *p)
+static void
+add_syms_with_phi (bitmap syms, unsigned to)
{
- return htab_hash_pointer ((const void *)((const mem_syms_map_t)p)->stmt);
-}
+ if (syms_with_phi_in_bb[to] == NULL)
+ syms_with_phi_in_bb[to] = BITMAP_ALLOC (NULL);
-static int
-mem_syms_eq (const void *p1, const void *p2)
-{
- return ((const mem_syms_map_t)p1)->stmt == ((const mem_syms_map_t)p2)->stmt;
+ bitmap_ior_into (syms_with_phi_in_bb[to], syms);
+
+ /* For placing factored PHI nodes, we are only interested in
+ considering those symbols that are marked for renaming.
+ Otherwise, we will be placing unnecessary factored PHI nodes. */
+ if (!bitmap_empty_p (syms_to_rename))
+ bitmap_and_into (syms_with_phi_in_bb[to], syms_to_rename);
}
+
+/* Add SYM to the set of symbols with existing PHI nodes in basic
+ block TO. */
+
static void
-mem_syms_free (void *p)
+add_sym_with_phi (tree sym, unsigned to)
{
- BITMAP_FREE (((mem_syms_map_t)p)->loaded);
- BITMAP_FREE (((mem_syms_map_t)p)->stored);
- free (p);
+ if (syms_with_phi_in_bb[to] == NULL)
+ syms_with_phi_in_bb[to] = BITMAP_ALLOC (NULL);
+
+ bitmap_set_bit (syms_with_phi_in_bb[to], DECL_UID (sym));
}
-/* Return the memory symbols referenced by STMT. */
+/* Add SYMS to the stored symbols for basic block TO. */
-static inline mem_syms_map_t
-syms_referenced_by (tree stmt)
+static void
+add_stored_syms (bitmap syms, unsigned to)
{
- struct mem_syms_map_d m, *mp;
- void **slot;
+ if (syms_stored_in_bb[to] == NULL)
+ syms_stored_in_bb[to] = BITMAP_ALLOC (NULL);
- m.stmt = stmt;
- slot = htab_find_slot (mem_syms_tbl, (void *) &m, INSERT);
- if (*slot == NULL)
- {
- mp = XNEW (struct mem_syms_map_d);
- mp->stmt = stmt;
- mp->loaded = BITMAP_ALLOC (NULL);
- mp->stored = BITMAP_ALLOC (NULL);
+ bitmap_ior_into (syms_stored_in_bb[to], syms);
- get_loads_and_stores (stmt, mp->loaded, mp->stored);
+ /* For placing factored PHI nodes, we are only interested in
+ considering those symbols that are marked for renaming.
+ Otherwise, we will be placing unnecessary factored PHI nodes. */
+ if (!bitmap_empty_p (syms_to_rename))
+ bitmap_and_into (syms_stored_in_bb[to], syms_to_rename);
+}
- if (bitmap_empty_p (mp->loaded))
- BITMAP_FREE (mp->loaded);
- if (bitmap_empty_p (mp->stored))
- BITMAP_FREE (mp->stored);
+/* Add SYM to the stored symbols for basic block TO. */
- *slot = (void *) mp;
- }
- else
- mp = (mem_syms_map_t) *slot;
+static void
+add_stored_sym (tree sym, unsigned to)
+{
+ if (syms_stored_in_bb[to] == NULL)
+ syms_stored_in_bb[to] = BITMAP_ALLOC (NULL);
- return mp;
+ bitmap_set_bit (syms_stored_in_bb[to], DECL_UID (sym));
}
@@ -843,51 +964,6 @@ mark_def_sites (struct dom_walk_data *walk_data, basic_block bb,
REGISTER_DEFS_IN_THIS_STMT (stmt) = 1;
}
- /* If the statement has memory references, process the associated
- symbols. */
- if (stmt_references_memory_p (stmt))
- {
- mem_syms_map_t syms;
- unsigned i;
- bitmap_iterator bi;
-
- syms = syms_referenced_by (stmt);
-
- if (syms->loaded)
- {
- REWRITE_THIS_STMT (stmt) = 1;
- EXECUTE_IF_SET_IN_BITMAP (syms->loaded, 0, i, bi)
- {
- tree sym = referenced_var (i);
-
- /* Note that VDEF operators are never considered killing
- definitions, so memory symbols are always
- live-on-entry. */
- set_livein_block (sym, bb);
- }
- }
-
- if (syms->stored)
- {
- REGISTER_DEFS_IN_THIS_STMT (stmt) = 1;
- REWRITE_THIS_STMT (stmt) = 1;
-
- EXECUTE_IF_SET_IN_BITMAP (syms->stored, 0, i, bi)
- {
- tree sym = referenced_var (i);
- set_def_block (sym, bb, false);
-
- /* Note that virtual definitions are irrelevant for
- computing KILLS because a VDEF does not constitute a
- killing definition of the variable. However, the
- operand of a virtual definitions is a use of the
- variable, so it may cause the variable to be
- considered live-on-entry. */
- set_livein_block (sym, bb);
- }
- }
- }
-
/* If we found the statement interesting then also mark the block BB
as interesting. */
if (REWRITE_THIS_STMT (stmt) || REGISTER_DEFS_IN_THIS_STMT (stmt))
@@ -895,32 +971,32 @@ mark_def_sites (struct dom_walk_data *walk_data, basic_block bb,
}
-/* Given a set of blocks with variable definitions (DEF_BLOCKS),
- return a bitmap with all the blocks in the iterated dominance
- frontier of the blocks in DEF_BLOCKS. DFS contains dominance
- frontier information as returned by compute_dominance_frontiers.
+/* Given a set of blocks with variable definitions (DEF_BLOCKS) for
+ symbol VAR, return a bitmap with all the blocks in the iterated
+ dominance frontier of the blocks in DEF_BLOCKS. DFS contains
+ dominance frontier information as returned by
+ compute_dominance_frontiers.
The resulting set of blocks are the potential sites where PHI nodes
- are needed. The caller is responsible from freeing the memory
+ are needed. The caller is responsible for freeing the memory
allocated for the return value. */
static bitmap
-find_idf (bitmap def_blocks, bitmap *dfs)
+compute_idf (tree var, bitmap def_blocks, bitmap *dfs)
{
bitmap_iterator bi;
- unsigned bb_index;
+ unsigned bb_index, i;
VEC(int,heap) *work_stack;
bitmap phi_insertion_points;
work_stack = VEC_alloc (int, heap, n_basic_blocks);
phi_insertion_points = BITMAP_ALLOC (NULL);
- /* Seed the work list with all the blocks in DEF_BLOCKS. */
+ /* Seed the work list with all the blocks in DEF_BLOCKS. We use
+ VEC_quick_push here for speed. This is safe because we know that
+ the number of definition blocks is no greater than the number of
+ basic blocks, which is the initial capacity of WORK_STACK. */
EXECUTE_IF_SET_IN_BITMAP (def_blocks, 0, bb_index, bi)
- /* We use VEC_quick_push here for speed. This is safe because we
- know that the number of definition blocks is no greater than
- the number of basic blocks, which is the initial capacity of
- WORK_STACK. */
VEC_quick_push (int, work_stack, bb_index);
/* Pop a block off the worklist, add every block that appears in
@@ -939,14 +1015,53 @@ find_idf (bitmap def_blocks, bitmap *dfs)
we may pull a non-existing block from the work stack. */
gcc_assert (bb_index < (unsigned) last_basic_block);
+ /* When inserting PHI nodes for .MEM, we have to determine what
+ set of symbols should be initially associated to each PHI
+ node. For every block I in the dominance frontier of
+ BB_INDEX, we compute this initial set as the set of symbols
+ stored in BB_INDEX minus the set of symbols that already have
+ a PHI node in I:
+
+ SYMS_TO_FACTOR_IN_BB[I] += SYMS_STORED_IN_BB[BB_INDEX]
+ - SYMS_WITH_PHI_IN_BB[I]
+
+ We then add the resulting set to the set of symbols stored in
+ I, as the factored PHI node to be added will be a store
+ operation for this set. */
+ if (var == mem_var && syms_stored_in_bb[bb_index])
+ {
+ bitmap b = BITMAP_ALLOC (NULL);
+
+ EXECUTE_IF_SET_IN_BITMAP (dfs[bb_index], 0, i, bi)
+ {
+ if (syms_to_factor_in_bb[i] == NULL)
+ syms_to_factor_in_bb[i] = BITMAP_ALLOC (NULL);
+
+ if (syms_with_phi_in_bb[i])
+ {
+ bitmap_and_compl (b,
+ syms_stored_in_bb[bb_index],
+ syms_with_phi_in_bb[i]);
+ bitmap_ior_into (syms_to_factor_in_bb[i], b);
+ }
+ else
+ bitmap_ior_into (syms_to_factor_in_bb[i],
+ syms_stored_in_bb[bb_index]);
+
+ add_stored_syms (syms_to_factor_in_bb[i], i);
+ }
+
+ BITMAP_FREE (b);
+ }
+
EXECUTE_IF_AND_COMPL_IN_BITMAP (dfs[bb_index], phi_insertion_points,
- 0, bb_index, bi)
+ 0, i, bi)
{
/* Use a safe push because if there is a definition of VAR
in every basic block, then WORK_STACK may eventually have
more than N_BASIC_BLOCK entries. */
- VEC_safe_push (int, heap, work_stack, bb_index);
- bitmap_set_bit (phi_insertion_points, bb_index);
+ VEC_safe_push (int, heap, work_stack, i);
+ bitmap_set_bit (phi_insertion_points, i);
}
}
@@ -996,6 +1111,7 @@ mark_phi_for_rewrite (basic_block bb, tree phi)
if (REWRITE_THIS_STMT (phi))
return;
+
REWRITE_THIS_STMT (phi) = 1;
if (!blocks_with_phis_to_rewrite)
@@ -1014,12 +1130,12 @@ mark_phi_for_rewrite (basic_block bb, tree phi)
VEC_replace (tree_vec, phis_to_rewrite, idx, phis);
}
+
/* Insert PHI nodes for variable VAR using the iterated dominance
frontier given in PHI_INSERTION_POINTS. If UPDATE_P is true, this
- function assumes that the caller is incrementally updating the SSA
- form, in which case (1) VAR is assumed to be an SSA name, (2) a new
- SSA name is created for VAR's symbol, and, (3) all the arguments
- for the newly created PHI node are set to VAR.
+ function assumes that the caller is incrementally updating the
+ existing SSA form, in which case VAR may be an SSA name instead of
+ a symbol.
PHI_INSERTION_POINTS is updated to reflect nodes that already had a
PHI node for VAR. On exit, only the nodes that received a PHI node
@@ -1041,8 +1157,8 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
/* Remove the blocks where we already have PHI nodes for VAR. */
bitmap_and_compl_into (phi_insertion_points, def_map->phi_blocks);
- /* Now compute global livein for this variable. Note this modifies
- def_map->livein_blocks. */
+ /* Now compute global livein for this variable. Note that this
+ modifies DEF_MAP->LIVEIN_BLOCKS. */
compute_global_livein (def_map->livein_blocks, def_map->def_blocks);
/* And insert the PHI nodes. */
@@ -1053,7 +1169,9 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
if (update_p)
mark_block_for_update (bb);
- if (update_p && TREE_CODE (var) == SSA_NAME)
+ phi = NULL_TREE;
+
+ if (TREE_CODE (var) == SSA_NAME)
{
/* If we are rewriting SSA names, create the LHS of the PHI
node by duplicating VAR. This is useful in the case of
@@ -1062,7 +1180,12 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
edge_iterator ei;
tree new_lhs;
- phi = create_phi_node (var, bb);
+ gcc_assert (update_p);
+ if (SSA_NAME_VAR (var) == mem_var)
+ phi = create_factored_phi_node (var, bb, NULL);
+ else
+ phi = create_phi_node (var, bb);
+
new_lhs = duplicate_ssa_name (var, phi);
SET_PHI_RESULT (phi, new_lhs);
add_new_name_mapping (new_lhs, var);
@@ -1076,13 +1199,26 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
FOR_EACH_EDGE (e, ei, bb->preds)
add_phi_arg (phi, var, e);
}
- else
+ else if (var != mem_var)
{
tree sym = DECL_P (var) ? var : SSA_NAME_VAR (var);
phi = create_phi_node (sym, bb);
}
+ else
+ {
+ /* Initially, a factored PHI node in block BB is associated
+ with all the symbols that need to be factored in BB (as
+ computed in prepare_block_for_update and find_idf). */
+ bitmap syms = syms_to_factor_in_bb[bb->index];
+ if (bitmap_singleton_p (syms))
+ {
+ tree sym = referenced_var_lookup (bitmap_first_set_bit (syms));
+ phi = create_phi_node (sym, bb);
+ }
+ else
+ phi = create_factored_phi_node (mem_var, bb, syms);
+ }
- /* Mark this PHI node as interesting for update_ssa. */
REGISTER_DEFS_IN_THIS_STMT (phi) = 1;
mark_phi_for_rewrite (bb, phi);
}
@@ -1091,10 +1227,7 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
/* Insert PHI nodes at the dominance frontier of blocks with variable
definitions. DFS contains the dominance frontier information for
- the flowgraph. PHI nodes will only be inserted at the dominance
- frontier of definition blocks for variables whose NEED_PHI_STATE
- annotation is marked as ``maybe'' or ``unknown'' (computed by
- mark_def_sites). */
+ the flowgraph. */
static void
insert_phi_nodes (bitmap *dfs)
@@ -1115,7 +1248,7 @@ insert_phi_nodes (bitmap *dfs)
if (get_phi_state (var) != NEED_PHI_STATE_NO)
{
- idf = find_idf (def_map->def_blocks, dfs);
+ idf = compute_idf (var, def_map->def_blocks, dfs);
insert_phi_nodes_for (var, idf, false);
BITMAP_FREE (idf);
}
@@ -1215,10 +1348,7 @@ rewrite_initialize_block (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
{
tree result = PHI_RESULT (phi);
- /* FIXME. For memory symbols, this will be .MEM, but we should
- determine what symbol is associated with this particular PHI
- function. */
- gcc_assert (SSA_NAME_VAR (result) != mem_var);
+ gcc_assert (is_gimple_reg (result));
register_new_def (result, SSA_NAME_VAR (result));
}
}
@@ -1253,109 +1383,6 @@ get_reaching_def (tree var)
}
-/* Rewrite memory references in STMT. */
-
-static void
-rewrite_memory_stmt (tree stmt)
-{
- unsigned i;
- bitmap rdefs;
- bitmap_iterator bi;
- mem_syms_map_t syms;
-
- syms = syms_referenced_by (stmt);
-
- /* If the statement makes no loads or stores, it must have volatile
- operands. */
- if (syms->loaded == NULL && syms->stored == NULL)
- {
- gcc_assert (stmt_ann (stmt)->has_volatile_ops);
- return;
- }
-
- rdefs = BITMAP_ALLOC (NULL);
-
- /* Rewrite loaded symbols. */
- if (syms->loaded)
- {
- size_t j;
- struct vuse_optype_d *vuses;
-
- /* If STMT is a load, it should have exactly one VUSE operator. */
- vuses = VUSE_OPS (stmt);
- gcc_assert (vuses && vuses->next == NULL);
-
- /* Collect all the reaching definitions for symbols loaded by STMT. */
- bitmap_clear (rdefs);
- EXECUTE_IF_SET_IN_BITMAP (syms->loaded, 0, i, bi)
- {
- tree sym = referenced_var (i);
- tree rdef = get_reaching_def (sym);
- bitmap_set_bit (rdefs, SSA_NAME_VERSION (rdef));
- }
-
- /* Rewrite the VUSE as VUSE <{RDEFS}>. */
- vuses = realloc_vuse (vuses, bitmap_count_bits (rdefs));
- j = 0;
- EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
- SET_USE (VUSE_OP_PTR (vuses, j++), ssa_name (i));
-
- vuses = vuses->next;
- }
-
- /* Rewrite stored symbols. */
- if (syms->stored)
- {
- tree lhs, new_name;
- size_t j;
- struct vdef_optype_d *vdefs;
-
- /* If STMT is a store, it should have exactly one VDEF operator. */
- vdefs = VDEF_OPS (stmt);
- gcc_assert (vdefs && vdefs->next == NULL);
-
- /* Collect all the current reaching definitions for symbols in STORES. */
- bitmap_clear (rdefs);
- EXECUTE_IF_SET_IN_BITMAP (syms->stored, 0, i, bi)
- {
- tree sym = referenced_var (i);
- tree rdef = get_reaching_def (sym);
- bitmap_set_bit (rdefs, SSA_NAME_VERSION (rdef));
- }
-
- /* Create a new name for the LHS. If there is a single symbol
- in STORES, use it as the target of the VDEF. Otherwise
- factor all the stored symbols into .MEM. */
- if (bitmap_count_bits (syms->stored) == 1)
- lhs = referenced_var (bitmap_first_set_bit (syms->stored));
- else
- {
- lhs = VDEF_RESULT (vdefs);
- gcc_assert (lhs == mem_var);
- }
-
- new_name = make_ssa_name (lhs, stmt);
-
- /* Set NEW_NAME to be the current reaching definition for every
- symbol loaded by STMT. */
- EXECUTE_IF_SET_IN_BITMAP (syms->stored, 0, i, bi)
- {
- tree sym = referenced_var (i);
- register_new_def (new_name, sym);
- }
-
- /* Rewrite the VDEF as NEW_NAME = VDEF <{RDEFS}>. */
- vdefs = realloc_vdef (vdefs, bitmap_count_bits (rdefs));
- SET_DEF (VDEF_RESULT_PTR (vdefs), new_name);
- j = 0;
- EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
- SET_USE (VDEF_OP_PTR (vdefs, j++), ssa_name (i));
- }
-
- BITMAP_FREE (rdefs);
-}
-
-
/* SSA Rewriting Step 2. Rewrite every variable used in each statement in
the block with its immediate reaching definitions. Update the current
definition of a variable when a new real or virtual definition is found. */
@@ -1402,11 +1429,6 @@ rewrite_stmt (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
SET_DEF (def_p, make_ssa_name (var, stmt));
register_new_def (DEF_FROM_PTR (def_p), var);
}
-
- /* Rewrite virtual operands for statements that make memory
- references. */
- if (stmt_references_memory_p (stmt))
- rewrite_memory_stmt (stmt);
}
@@ -1436,8 +1458,8 @@ rewrite_add_phi_arguments (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
}
-/* Called after visiting basic block BB. Restore CURRDEFS to its
- original value. */
+/* Called after visiting all the statements in basic block BB and all
+ of its dominator children. Restore CURRDEFS to its original value. */
static void
rewrite_finalize_block (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
@@ -1480,6 +1502,40 @@ rewrite_finalize_block (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
}
+/* Dump bitmap SET (assumed to contain VAR_DECLs) to FILE. */
+
+void
+dump_decl_set (FILE *file, bitmap set)
+{
+ if (set)
+ {
+ bitmap_iterator bi;
+ unsigned i;
+
+ fprintf (file, "{ ");
+
+ EXECUTE_IF_SET_IN_BITMAP (set, 0, i, bi)
+ {
+ print_generic_expr (file, referenced_var (i), 0);
+ fprintf (file, " ");
+ }
+
+ fprintf (file, "}\n");
+ }
+ else
+ fprintf (file, "NIL\n");
+}
+
+
+/* Dump bitmap SET (assumed to contain VAR_DECLs) to stderr. */
+
+void
+debug_decl_set (bitmap set)
+{
+ dump_decl_set (stderr, set);
+}
+
+
/* Dump the renaming stack (block_defs_stack) to FILE. Traverse the
stack up to a maximum of N levels. If N is -1, the whole stack is
dumped. New levels are created when the dominator tree traversal
@@ -1560,18 +1616,20 @@ dump_currdefs (FILE *file)
fprintf (file, "\n\nCurrent reaching definitions\n\n");
FOR_EACH_REFERENCED_VAR (var, i)
- {
- fprintf (file, "CURRDEF (");
- print_generic_expr (file, var, 0);
- fprintf (file, ") = ");
- if (get_current_def (var))
- print_generic_expr (file, get_current_def (var), 0);
- else
- fprintf (file, "<NIL>");
- fprintf (file, "\n");
- }
+ if (syms_to_rename == NULL || bitmap_bit_p (syms_to_rename, DECL_UID (var)))
+ {
+ fprintf (file, "CURRDEF (");
+ print_generic_expr (file, var, 0);
+ fprintf (file, ") = ");
+ if (get_current_def (var))
+ print_generic_expr (file, get_current_def (var), 0);
+ else
+ fprintf (file, "<NIL>");
+ fprintf (file, "\n");
+ }
}
+
/* Dump the current reaching definition of every symbol to stderr. */
void
@@ -1581,6 +1639,98 @@ debug_currdefs (void)
}
+/* Dump symbols stored in each basic block on FILE. */
+
+void
+dump_stored_syms (FILE *file)
+{
+ basic_block bb;
+
+ if (syms_stored_in_bb == NULL
+ && syms_with_phi_in_bb == NULL
+ && syms_to_factor_in_bb == NULL)
+ return;
+
+ fprintf (file, "\n\nMemory symbols stored in each block\n\n");
+ FOR_EACH_BB (bb)
+ {
+ bool newline_p = false;
+
+ if (syms_stored_in_bb && syms_stored_in_bb[bb->index])
+ {
+ fprintf (file, "SYMS_STORED_IN_BB[%d] = ", bb->index);
+ dump_decl_set (file, syms_stored_in_bb[bb->index]);
+ newline_p = true;
+ }
+
+ if (syms_with_phi_in_bb && syms_with_phi_in_bb[bb->index])
+ {
+ fprintf (file, "SYMS_WITH_PHI_IN_BB[%d] = ", bb->index);
+ dump_decl_set (file, syms_with_phi_in_bb[bb->index]);
+ newline_p = true;
+ }
+
+ if (syms_to_factor_in_bb && syms_to_factor_in_bb[bb->index])
+ {
+ fprintf (file, "SYMS_TO_FACTOR_IN_BB[%d] = ", bb->index);
+ dump_decl_set (file, syms_to_factor_in_bb[bb->index]);
+ newline_p = true;
+ }
+
+ if (newline_p)
+ fprintf (file, "\n");
+ }
+}
+
+
+/* Dump symbols stored in each basic block on stderr. */
+
+void
+debug_stored_syms (void)
+{
+ dump_stored_syms (stderr);
+}
+
+
+/* Dump unfactored PHI node PHI to stderr. */
+
+void
+debug_unfactored_phi (tree phi)
+{
+ dump_unfactored_phi (stderr, phi);
+}
+
+
+/* Dump the list of unfactored PHIs to FILE. */
+
+void
+dump_unfactored_phis (FILE *file)
+{
+ struct unfactored_phis_d *n;
+ unsigned i;
+
+ if (unfactored_phis == NULL)
+ return;
+
+ fprintf (file, "\n\nUnfactored PHI nodes\n\n");
+
+ for (i = 0, n = first_unfactored_phi; n; n = n->next, i++)
+ {
+ fprintf (file, "#%d: ", i);
+ dump_unfactored_phi (file, n->phi);
+ }
+}
+
+
+/* Dump the list of unfactored PHIs to stderr. */
+
+void
+debug_unfactored_phis (void)
+{
+ dump_unfactored_phis (stderr);
+}
+
+
/* Dump SSA information to FILE. */
void
@@ -1594,6 +1744,9 @@ dump_tree_ssa (FILE *file)
dump_def_blocks (file);
dump_defs_stack (file, -1);
dump_currdefs (file);
+ dump_stored_syms (file);
+ dump_unfactored_phis (file);
+ dump_tree_ssa_stats (file);
}
@@ -1623,12 +1776,23 @@ htab_statistics (FILE *file, htab_t htab)
void
dump_tree_ssa_stats (FILE *file)
{
- fprintf (file, "\nHash table statistics:\n");
+ if (def_blocks || repl_tbl)
+ fprintf (file, "\nHash table statistics:\n");
+
+ if (def_blocks)
+ {
+ fprintf (file, " def_blocks: ");
+ htab_statistics (file, def_blocks);
+ }
- fprintf (file, " def_blocks: ");
- htab_statistics (file, def_blocks);
+ if (repl_tbl)
+ {
+ fprintf (file, " repl_tbl: ");
+ htab_statistics (file, repl_tbl);
+ }
- fprintf (file, "\n");
+ if (def_blocks || repl_tbl)
+ fprintf (file, "\n");
}
@@ -1785,10 +1949,12 @@ rewrite_update_init_block (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
register it as a new definition for its corresponding name. Also
register definitions for names whose underlying symbols are
marked for renaming. */
-
for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
{
tree lhs, lhs_sym;
+ bool old_p, new_p;
+
+ set_next_dom_num (phi);
if (!REGISTER_DEFS_IN_THIS_STMT (phi))
continue;
@@ -1796,22 +1962,38 @@ rewrite_update_init_block (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
lhs = PHI_RESULT (phi);
lhs_sym = SSA_NAME_VAR (lhs);
- if (symbol_marked_for_renaming (lhs_sym))
- /* If LHS is .MEM here, we should know which symbol is this
- .MEM a new name for. */
- register_new_update_single (lhs, lhs_sym);
- else
+ new_p = is_new_name (lhs);
+ old_p = is_old_name (lhs);
+ if (new_p || old_p)
{
/* If LHS is a new name, register a new definition for all
the names replaced by LHS. */
- if (is_new_name (lhs))
+ if (new_p)
register_new_update_set (lhs, names_replaced_by (lhs));
/* If LHS is an OLD name, register it as a new definition
for itself. */
- if (is_old_name (lhs))
+ if (old_p)
register_new_update_single (lhs, lhs);
}
+ else if (lhs_sym == mem_var && !bitmap_empty_p (syms_to_rename))
+ {
+ /* If LHS is a name for .MEM, then PHI becomes the current
+ reaching definition for all the symbols factored in it. */
+ bitmap_iterator bi;
+ unsigned i;
+ bitmap syms;
+
+ syms = get_loads_and_stores (phi)->stores;
+ EXECUTE_IF_AND_IN_BITMAP (syms, syms_to_rename, 0, i, bi)
+ register_new_update_single (lhs, referenced_var (i));
+ }
+ else if (symbol_marked_for_renaming (lhs_sym))
+ {
+ /* If LHS is a regular symbol marked for renaming, register
+ LHS as its current reaching definition. */
+ register_new_update_single (lhs, lhs_sym);
+ }
if (is_abnormal_phi)
SSA_NAME_OCCURS_IN_ABNORMAL_PHI (lhs) = 1;
@@ -1961,141 +2143,47 @@ stale_ssa_name_p (tree n)
}
-/* Preserve in RDEFS all the names from the virtual operands in STMT
- that represent the symbols in UNMARKED_SYMS. WHICH_VOPS indicates
- what virtual operator to process.
-
- At this point, RDEFS contains the reaching definitions for all the
- symbols marked for renaming in SYMS. However, there may be some
- symbols in SYMS that had not been marked for renaming (i.e., those
- collected in UNMARKED_SYMS).
-
- We need to match those symbols in UNMARKED_SYMS to existing SSA
- names in the virtual operands for STMT. Otherwise, we will remove
- use-def chains for symbols that had not been marked for renaming.
-
- Notice that we need to do all this maneuvering because the SSA
- names for .MEM in the operand list may need to be removed when they
- don't represent symbols in this statement anymore.
-
- After this, names in WHICH_VOPS that have not been added to RDEFS
- will be discarded. */
+/* Given a set of SSA names (RDEFS), add all the names in the set as
+ operands to the virtual operator WHICH_VOP for statement STMT. */
static void
-preserve_needed_names_in_vops (tree stmt, bitmap unmarked_syms,
- bitmap rdefs, int which_vops)
+rewrite_vops (tree stmt, bitmap rdefs, int which_vops)
{
- tree name, sym;
- use_operand_p use_p;
- ssa_op_iter iter;
- VEC(tree,heap) *mem_names_stack = NULL;
-
- /* We first need to match SSA names for individual symbols to
- avoid the following problem:
-
- 1 # .MEM_3 = VDEF <a_5, b_1, c_9>
- 2 *p_8 = ...
- 3 ...
- 4 # a_7 = VDEF <.MEM_3>
- 5 a = ...
- 6 ...
- 7 # VUSE <.MEM_3, a_7>
- 8 ... = a
-
- Suppose that we are only renaming 'a'. Since .MEM_3 was used
- to factor a store to 'a' and 'b', if we process .MEM_3 first,
- we will decide to keep the .MEM_3 operand because 'a' is one of its
- factored symbols. However, the definition of 'a_7' now supercedes
- the definition for .MEM_3, and so, .MEM_3 needs to be discarded.
-
- Doing this two-stage matching is correct because the existing
- SSA web guarantees that the individual name 'a_7'
- post-dominates the factored name '.MEM_3'. If there was
- another factored store in the path, then we would have
- another factored .MEM name reaching the load from 'a' at line 8. */
- FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, which_vops)
- {
- name = USE_FROM_PTR (use_p);
- sym = DECL_P (name) ? name : SSA_NAME_VAR (name);
-
- if (TREE_CODE (name) == SSA_NAME && !name_marked_for_release_p (name))
- {
- if (sym == mem_var)
- {
- /* Save .MEM names to be processed after individual
- names. */
- VEC_safe_push (tree, heap, mem_names_stack, name);
- }
- else if (bitmap_bit_p (unmarked_syms, DECL_UID (sym)))
- {
- /* Otherwise, if SYM is in UNMARKED_SYMS and NAME
- has not been marked to be released, add NAME to
- RDEFS. */
- bitmap_set_bit (rdefs, SSA_NAME_VERSION (name));
- bitmap_clear_bit (unmarked_syms, DECL_UID (sym));
- }
- }
- }
+ unsigned num_rdefs, i, j;
+ bitmap_iterator bi;
- /* If we still have unmarked symbols, match them to the .MEM name(s) */
- if (!bitmap_empty_p (unmarked_syms))
+ num_rdefs = bitmap_count_bits (rdefs);
+ if (which_vops == SSA_OP_VUSE)
{
- mem_syms_map_t syms;
- bool found_default_mem_name_p = false;
- tree mem_default_def = default_def (mem_var);
- bitmap tmp = BITMAP_ALLOC (NULL);
- bitmap matched_syms = BITMAP_ALLOC (NULL);
-
- while (VEC_length (tree, mem_names_stack) > 0)
- {
- name = VEC_pop (tree, mem_names_stack);
+ /* STMT should have exactly one VUSE operator. */
+ struct vuse_optype_d *vuses = VUSE_OPS (stmt);
+ gcc_assert (vuses && vuses->next == NULL);
- if (name == mem_default_def)
- {
- /* The default definition for .MEM is special because it
- matches every memory symbol. However, we only want
- to use it as a last resort (i.e., if no other
- existing SSA name matches). */
- found_default_mem_name_p = true;
- continue;
- }
+ vuses = realloc_vuse (vuses, num_rdefs);
+ j = 0;
+ EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
+ SET_USE (VUSE_OP_PTR (vuses, j++), ssa_name (i));
+ }
+ else
+ {
+ tree lhs;
+ struct vdef_optype_d *vdefs;
- /* If this .MEM name matches some of the symbols in
- UNMARKED_SYMS, add it to RDEFS. */
- syms = syms_referenced_by (SSA_NAME_DEF_STMT (name));
- if (bitmap_intersect_p (unmarked_syms, syms->stored))
- {
- /* Remove from UNMARKED_SYMS the symbols that are common
- between UNMARKED_SYMS and SYMS->STORED. If there were
- symbols in common, add NAME to RDEFS. */
- bitmap_set_bit (rdefs, SSA_NAME_VERSION (name));
-
- /* Add the matched symbols to MATCHED_SYMS. */
- bitmap_and (tmp, unmarked_syms, syms->stored);
- bitmap_ior_into (matched_syms, tmp);
- }
- }
+ /* STMT should have exactly one VDEF operator. */
+ vdefs = VDEF_OPS (stmt);
+ gcc_assert (vdefs && vdefs->next == NULL);
- /* Remove all the matched symbols from UNMARKED_SYMS, if all the
- unmarked symbols were matched, then we can discard the
- default definition for .MEM (if we found one). */
- bitmap_and_compl_into (unmarked_syms, matched_syms);
+ /* Preserve the existing LHS to avoid creating SSA names
+ unnecessarily. */
+ lhs = VDEF_RESULT (vdefs);
- /* The default definition for .MEM matches all the symbols that
- could not be matched to other names in the list of operands. */
- if (found_default_mem_name_p && !bitmap_empty_p (unmarked_syms))
- {
- bitmap_set_bit (rdefs, SSA_NAME_VERSION (mem_default_def));
- bitmap_clear (unmarked_syms);
- }
+ vdefs = realloc_vdef (vdefs, num_rdefs);
+ j = 0;
+ EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
+ SET_USE (VDEF_OP_PTR (vdefs, j++), ssa_name (i));
- BITMAP_FREE (tmp);
- BITMAP_FREE (matched_syms);
+ SET_DEF (VDEF_RESULT_PTR (vdefs), lhs);
}
-
- /* We should have found matches for every symbol in UNMARKED_SYMS. */
- gcc_assert (unmarked_syms == NULL || bitmap_empty_p (unmarked_syms));
- VEC_free (tree, heap, mem_names_stack);
}
@@ -2114,16 +2202,14 @@ preserve_needed_names_in_vops (tree stmt, bitmap unmarked_syms,
static void
rewrite_update_stmt_vops (tree stmt, bitmap syms, bitmap rdefs, int which_vops)
{
- unsigned i, j, num_rdefs;
+ unsigned i;
bitmap_iterator bi;
- bool all_marked_p;
bitmap unmarked_syms = NULL;
gcc_assert (which_vops == SSA_OP_VUSE || which_vops == SSA_OP_VMAYUSE);
/* Collect all the reaching definitions for symbols marked for
renaming in SYMS. */
- all_marked_p = true;
EXECUTE_IF_SET_IN_BITMAP (syms, 0, i, bi)
{
tree sym = referenced_var (i);
@@ -2134,8 +2220,6 @@ rewrite_update_stmt_vops (tree stmt, bitmap syms, bitmap rdefs, int which_vops)
}
else
{
- all_marked_p = false;
-
/* Add SYM to UNMARKED_SYMS so that they can be matched to
existing SSA names in WHICH_VOPS. */
if (unmarked_syms == NULL)
@@ -2146,45 +2230,70 @@ rewrite_update_stmt_vops (tree stmt, bitmap syms, bitmap rdefs, int which_vops)
/* Preserve names from VOPS that are needed for the symbols that
have not been marked for renaming. */
- if (!all_marked_p)
+ if (unmarked_syms)
{
- preserve_needed_names_in_vops (stmt, unmarked_syms, rdefs, which_vops);
- BITMAP_FREE (unmarked_syms);
- }
+ tree name;
+ use_operand_p use_p;
+ ssa_op_iter iter;
+ bitmap old_rdefs;
+ unsigned i;
+ bitmap_iterator bi;
- /* Rewrite the appropriate virtual operand setting its RHS to RDEFS. */
- num_rdefs = bitmap_count_bits (rdefs);
- if (which_vops == SSA_OP_VUSE)
- {
- /* STMT should have exactly one VUSE operator. */
- struct vuse_optype_d *vuses = VUSE_OPS (stmt);
- gcc_assert (vuses && vuses->next == NULL);
+ old_rdefs = BITMAP_ALLOC (NULL);
+ FOR_EACH_SSA_USE_OPERAND (use_p, stmt, iter, which_vops)
+ {
+ name = USE_FROM_PTR (use_p);
+ bitmap_set_bit (old_rdefs, SSA_NAME_VERSION (name));
+ }
- vuses = realloc_vuse (vuses, num_rdefs);
- j = 0;
- EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
- SET_USE (VUSE_OP_PTR (vuses, j++), ssa_name (i));
- }
- else
- {
- tree lhs;
- struct vdef_optype_d *vdefs;
+ bitmap_and_compl_into (old_rdefs, rdefs);
- /* STMT should have exactly one VDEF operator. */
- vdefs = VDEF_OPS (stmt);
- gcc_assert (vdefs && vdefs->next == NULL);
+ EXECUTE_IF_SET_IN_BITMAP (old_rdefs, 0, i, bi)
+ {
+ tree name = ssa_name (i);
- /* Preserve the existing LHS to avoid creating SSA names
- unnecessarily. */
- lhs = VDEF_RESULT (vdefs);
+ /* 1- Names in OLD_RDEFS that are marked for release or
+ stale are discarded.
- vdefs = realloc_vdef (vdefs, num_rdefs);
- j = 0;
- EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
- SET_USE (VDEF_OP_PTR (vdefs, j++), ssa_name (i));
+ 2- .MEM's default definition is always kept.
- SET_DEF (VDEF_RESULT_PTR (vdefs), lhs);
+ 3- If a name in OLD_RDEFS only matches symbols that have been
+ marked for renaming, then those symbols have already been
+ matched above by their current reaching definition
+	 (i.e., by one of the names in RDEFS), so the old name
+	 needs to be discarded.  */
+ if (name_marked_for_release_p (name) || stale_ssa_name_p (name))
+ continue;
+ else if (name == default_def (mem_var))
+ bitmap_set_bit (rdefs, i);
+ else
+ {
+ bitmap syms;
+ syms = get_loads_and_stores (SSA_NAME_DEF_STMT (name))->stores;
+
+ if (bitmap_intersect_p (syms, syms_to_rename)
+ && !bitmap_intersect_p (syms, unmarked_syms))
+ {
+ /* If NAME factors symbols marked for renaming but
+ it does not factor any symbols in UNMARKED_SYMS,
+ then it is not needed because a different name is
+ now the reaching definition for those symbols. */
+ continue;
+ }
+ else
+ {
+ /* Otherwise, NAME must be factoring one of the
+ unmarked symbols. Leave it. */
+ bitmap_set_bit (rdefs, i);
+ }
+ }
+ }
+
+ BITMAP_FREE (unmarked_syms);
}
+
+ /* Rewrite the appropriate virtual operand setting its RHS to RDEFS. */
+ rewrite_vops (stmt, rdefs, which_vops);
}
@@ -2208,7 +2317,7 @@ register_new_vdef_name (tree stmt, bitmap stores)
/* If there is a single symbol in STORES, use it as the target
of the VDEF. Otherwise factor all the stored symbols into
.MEM. */
- if (bitmap_count_bits (stores) == 1)
+ if (bitmap_singleton_p (stores))
lhs = referenced_var (bitmap_first_set_bit (stores));
else
lhs = mem_var;
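
    [The new predicate bitmap_singleton_p used above is introduced by this
    patch, but its implementation is not visible in this hunk.  As a rough
    illustration only (not necessarily what the patch adds), such a
    predicate can stop scanning as soon as a second bit is found, which is
    what makes it preferable to comparing bitmap_count_bits against 1:

      /* Sketch only: return true iff exactly one bit is set in B.
	 Unlike bitmap_count_bits, stop as soon as a second bit is seen.  */
      bool
      bitmap_singleton_p (bitmap b)
      {
	unsigned i, count;
	bitmap_iterator bi;

	count = 0;
	EXECUTE_IF_SET_IN_BITMAP (b, 0, i, bi)
	  if (++count > 1)
	    return false;

	return count == 1;
      }
    ]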
@@ -2223,7 +2332,7 @@ register_new_vdef_name (tree stmt, bitmap stores)
Otherwise, create a new SSA name for it. */
tree new_lhs_sym;
- if (bitmap_count_bits (stores) == 1)
+ if (bitmap_singleton_p (stores))
new_lhs_sym = referenced_var (bitmap_first_set_bit (stores));
else
new_lhs_sym = mem_var;
@@ -2234,9 +2343,7 @@ register_new_vdef_name (tree stmt, bitmap stores)
{
/* Create a new SSA name for the LHS and mark the original
LHS stale. This will prevent rewrite_update_stmt_vops
- from keeping LHS in statements that still use it. FIXME,
- this does not help statements that are never visited by
- update_ssa. */
+ from keeping LHS in statements that still use it. */
new_name = make_ssa_name (new_lhs_sym, stmt);
mark_ssa_name_stale (lhs);
}
@@ -2254,41 +2361,42 @@ register_new_vdef_name (tree stmt, bitmap stores)
}
-/* Update every SSA memory reference in STMT. */
+/* Update every SSA memory reference in STMT. If SET_CURRDEF_P is
+ false, no new definitions will be registered for store operations.
+ This is used when post-processing unfactored PHI nodes in
+ fixup_unfactored_phis. */
static void
-rewrite_update_memory_stmt (tree stmt)
+rewrite_update_memory_stmt (tree stmt, bool set_currdef_p)
{
bitmap rdefs;
mem_syms_map_t syms;
- syms = syms_referenced_by (stmt);
+ syms = get_loads_and_stores (stmt);
- /* If the statement makes no loads or stores, it must have volatile
- operands. */
- if (syms->loaded == NULL && syms->stored == NULL)
- {
- gcc_assert (stmt_ann (stmt)->has_volatile_ops);
- return;
- }
+ if (syms->loads == NULL && syms->stores == NULL)
+ return;
rdefs = BITMAP_ALLOC (NULL);
/* Rewrite loaded symbols marked for renaming. */
- if (syms->loaded)
+ if (syms->loads)
{
- rewrite_update_stmt_vops (stmt, syms->loaded, rdefs, SSA_OP_VUSE);
+ rewrite_update_stmt_vops (stmt, syms->loads, rdefs, SSA_OP_VUSE);
bitmap_clear (rdefs);
}
- if (syms->stored)
+ if (syms->stores)
{
/* Rewrite stored symbols marked for renaming. */
- rewrite_update_stmt_vops (stmt, syms->stored, rdefs, SSA_OP_VMAYUSE);
+ rewrite_update_stmt_vops (stmt, syms->stores, rdefs, SSA_OP_VMAYUSE);
- /* Register the LHS of the VDEF to be the new reaching
- definition of all the symbols in STORES. */
- register_new_vdef_name (stmt, syms->stored);
+ if (set_currdef_p)
+ {
+ /* Register the LHS of the VDEF to be the new reaching
+ definition of all the symbols in STORES. */
+ register_new_vdef_name (stmt, syms->stores);
+ }
}
BITMAP_FREE (rdefs);
@@ -2318,6 +2426,8 @@ rewrite_update_stmt (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
gcc_assert (bitmap_bit_p (blocks_to_update, bb->index));
+ set_next_dom_num (stmt);
+
/* Only update marked statements. */
if (!REWRITE_THIS_STMT (stmt) && !REGISTER_DEFS_IN_THIS_STMT (stmt))
return;
@@ -2333,7 +2443,7 @@ rewrite_update_stmt (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
if (need_to_update_vops_p
&& stmt_references_memory_p (stmt)
&& !bitmap_empty_p (syms_to_rename))
- rewrite_update_memory_stmt (stmt);
+ rewrite_update_memory_stmt (stmt, true);
/* Rewrite USES included in OLD_SSA_NAMES and USES whose underlying
symbol is marked for renaming. */
@@ -2362,6 +2472,333 @@ replace_use (use_operand_p use_p, tree use)
}
+/* Add symbol UID to the set of symbols reached by SSA name NAME. */
+
+static void
+add_reached_sym (tree name, unsigned uid)
+{
+ ssa_name_info_p ann = get_ssa_name_ann (name);
+
+ if (ann->reached_syms == NULL)
+ ann->reached_syms = BITMAP_ALLOC (NULL);
+ bitmap_set_bit (ann->reached_syms, uid);
+}
+
+
+/* Lookup PHI node PHI in the table of unfactored PHI nodes. Return
+ NULL if PHI is not in the table. */
+
+static unfactored_phis_t
+lookup_unfactored_phi (tree phi)
+{
+ struct unfactored_phis_d up;
+ void **slot;
+
+ if (unfactored_phis == NULL)
+ return NULL;
+
+ up.phi = phi;
+ slot = htab_find_slot (unfactored_phis, (void *) &up, NO_INSERT);
+ if (slot == NULL)
+ return NULL;
+
+ return (unfactored_phis_t) *slot;
+}
+
+
+/* Lookup PHI node PHI in the table of unfactored PHI nodes. Create a
+ new entry for PHI if needed. */
+
+static unfactored_phis_t
+get_unfactored_phi (tree phi)
+{
+ struct unfactored_phis_d up, *up_p;
+ void **slot;
+
+ if (unfactored_phis == NULL)
+ {
+ unfactored_phis = htab_create (20, unfactored_phis_hash,
+ unfactored_phis_eq, unfactored_phis_free);
+ gcc_assert (first_unfactored_phi == NULL && last_unfactored_phi == NULL);
+ }
+
+ up.phi = phi;
+ slot = htab_find_slot (unfactored_phis, (void *) &up, INSERT);
+ if (*slot == NULL)
+ {
+ up_p = XNEW (struct unfactored_phis_d);
+ up_p->phi = phi;
+ up_p->children = NULL;
+ up_p->next = NULL;
+
+	  /* Keep the unfactored PHIs in a singly-linked list.  Since this
+ table is hashed by address, this avoids ordering issues when
+ traversing the hash table in fixup_unfactored_phis. */
+ if (last_unfactored_phi == NULL)
+ {
+ first_unfactored_phi = up_p;
+ last_unfactored_phi = up_p;
+ }
+ else
+ {
+ last_unfactored_phi->next = up_p;
+ last_unfactored_phi = up_p;
+ }
+
+ *slot = (void *) up_p;
+ }
+ else
+ up_p = (unfactored_phis_t) *slot;
+
+ return up_p;
+}
+
+
+/* Split a factored PHI node PHI with multiple reaching definitions
+ for the argument corresponding to edge E. PHI_SYMS is the set of
+   symbols currently factored in PHI; RDEFS is the set of SSA names
+ reaching PHI_SYMS through edge E.
+
+ When we initially compute PHI node placements, memory symbols are
+ all treated as the single object named .MEM. Each PHI node is then
+ associated with a default set of symbols initialized from the sets
+ of stored symbols in the feeding basic blocks (SYMS_TO_FACTOR_IN_BB).
+
+ This grouping allows PHI placement to be efficient at the expense
+ of accuracy. For a given incoming edge E, it may happen that not
+ all the symbols in PHI_SYMS have the same reaching definition. For
+ instance,
+
+ # BLOCK 2
+ # .MEM_11 = VDEF <.MEM_10(D)>
+ D.5771_4 = fwrite (&__gcov_var ...);
+ if (D.5771_4 != 1) goto <L0>; else goto <L1>;
+
+ # BLOCK 3
+ <L0>:;
+ # SFT.124_14 = VDEF <.MEM_11>
+ __gcov_var.error = 1;
+ # SUCC: 4 (fallthru)
+
+ # BLOCK 4
+ # .MEM_9 = PHI <.MEM_11(2), { .MEM_11, SFT.124_14 }(3)>;
+ <L1>:;
+
+ Initially, .MEM_9 is a factored PHI node for all the call clobbered
+ symbols in this program (i.e., all the SFTs for __gcov_var).
+
+ Notice, however, that the second argument for .MEM_9 is reached by
+ two names: SFT.124_14 for the single store to SFT.124 in block #3
+ and .MEM_11 for all the other call-clobbered variables. This means
+ that we need two PHI nodes to split the second argument:
+
+ .MEM_9 = PHI <.MEM_11(2), SFT.124_14(3)>
+ .MEM_15 = PHI <.MEM_11(2), .MEM_11(3)>
+
+   (we do not try to optimize the copy PHI node .MEM_15 that we just
+ created)
+
+ Although the new .MEM_15 PHI is reached by .MEM_11 through both
+ edges, it must not be considered a definition for SFT.124. The
+ correct definition for SFT.124 is .MEM_9.
+
+ So, when creating a new PHI node, we explicitly initialize the set
+ of symbols stored in it.
+
+ FIXME, it is not always necessary to add a split PHI node to the
+ list of PHI nodes to post-process. If the dominator children for
+ BB have not been renamed yet, we can just mark PHI and its children
+ as interesting definitions and let the renamer handle everything.
+ This will minimize the number of PHI nodes that need to be
+ post-processed in fixup_unfactored_phis. */
+
+static void
+split_factored_phi (tree phi, edge e, basic_block bb, bitmap phi_syms,
+ bitmap rdefs)
+{
+ unsigned i;
+ bitmap_iterator bi;
+ unfactored_phis_t n;
+ tree phi_lhs;
+
+ timevar_push (TV_TREE_SSA_PHI_UNFACTOR);
+
+ n = get_unfactored_phi (phi);
+ phi_lhs = PHI_RESULT (phi);
+
+ /* Process all the reaching definitions for PHI_SYMS, creating a new
+ PHI node for each one. */
+ EXECUTE_IF_SET_IN_BITMAP (rdefs, 0, i, bi)
+ {
+ edge f;
+ edge_iterator fi;
+ ssa_name_info_p ann;
+ tree rdef, new_phi, rdef_sym;
+ use_operand_p arg_p, new_arg_p;
+
+ rdef = ssa_name (i);
+ rdef_sym = SSA_NAME_VAR (rdef);
+ ann = get_ssa_name_ann (rdef);
+
+ /* Initialize the set of symbols that should be associated
+ with the new PHI node. Only the symbols reached by RDEF
+ should be associated with NEW_PHI. */
+      /* FIXME, we could probably avoid using REACHED_SYMS here.  They are
+	 implied by the reaching definition of MEM_VAR.  */
+ if (bitmap_singleton_p (ann->reached_syms))
+ {
+ tree sym;
+ sym = referenced_var (bitmap_first_set_bit (ann->reached_syms));
+ new_phi = create_phi_node (sym, bb);
+ }
+ else
+ new_phi = create_factored_phi_node (mem_var, bb, ann->reached_syms);
+
+ get_stmt_ann (new_phi)->uid = get_stmt_ann (phi)->uid;
+
+      /* Set the argument corresponding to edge E.  */
+ new_arg_p = PHI_ARG_DEF_PTR_FROM_EDGE (new_phi, e);
+ SET_USE (new_arg_p, rdef);
+
+ /* Set abnormal flags to NEW_PHI and its argument. */
+ if (e->flags & EDGE_ABNORMAL)
+ SSA_NAME_OCCURS_IN_ABNORMAL_PHI (rdef) = 1;
+
+ SSA_NAME_OCCURS_IN_ABNORMAL_PHI (PHI_RESULT (new_phi)) =
+ SSA_NAME_OCCURS_IN_ABNORMAL_PHI (phi_lhs);
+
+ /* Add NEW_PHI to the list of PHI nodes to rewrite. */
+ mark_phi_for_rewrite (bb, new_phi);
+ REGISTER_DEFS_IN_THIS_STMT (new_phi) = 1;
+
+ /* Add NEW_PHI to the list of nodes unfactored out of PHI. */
+ VEC_safe_push (tree, heap, n->children, new_phi);
+
+ /* Every other argument not coming through E must be copied
+ from the original PHI node. The only exception are
+ self-referential arguments. If an argument ARG is the same
+ name as the LHS of the original PHI node, we have to use the
+ LHS of the new child PHI node in its place. */
+ FOR_EACH_EDGE (f, fi, bb->preds)
+ if (e != f)
+ {
+ tree arg;
+ arg_p = PHI_ARG_DEF_PTR_FROM_EDGE (phi, f);
+ arg = USE_FROM_PTR (arg_p);
+ new_arg_p = PHI_ARG_DEF_PTR_FROM_EDGE (new_phi, f);
+ if (arg != phi_lhs)
+ SET_USE (new_arg_p, USE_FROM_PTR (arg_p));
+ else
+ SET_USE (new_arg_p, PHI_RESULT (new_phi));
+ }
+
+ /* The symbols reached by RDEF are now factored in NEW_PHI.
+ Therefore, they must be removed from the set of symbols
+ stored by the original PHI node. */
+ bitmap_and_compl_into (phi_syms, ann->reached_syms);
+ BITMAP_FREE (ann->reached_syms);
+ }
+
+ timevar_pop (TV_TREE_SSA_PHI_UNFACTOR);
+}
+
+
+/* Replace the PHI argument coming through edge E. BB is the block
+ holding PHI. PHI is assumed to be a factored PHI node (i.e., its
+ LHS is an SSA name for .MEM), which means that the argument may
+   have more than one reaching definition.  In the presence of multiple
+ reaching definitions, PHI will be split up to accommodate the
+ multiple reaching defs. Return true if PHI was split. Return
+ false otherwise. */
+
+static bool
+replace_factored_phi_argument (tree phi, edge e, basic_block bb)
+{
+ bitmap phi_syms, rdefs;
+ bitmap_iterator bi;
+ unsigned i;
+ tree rdef, rdef_sym, first_rdef, last_rdef;
+
+ rdefs = NULL;
+ last_rdef = NULL_TREE;
+ first_rdef = NULL_TREE;
+
+ phi_syms = get_loads_and_stores (phi)->stores;
+ if (!bitmap_intersect_p (phi_syms, syms_to_rename))
+ {
+ /* If PHI has no symbols to rename and the argument at E does
+ not exist, it means that we have completely unfactored this
+ PHI node. In which case, add .MEM's default definition to
+ avoid confusing the verifier later on. */
+ use_operand_p arg_p = PHI_ARG_DEF_PTR_FROM_EDGE (phi, e);
+ tree arg = USE_FROM_PTR (arg_p);
+ if (arg == NULL)
+ SET_USE (arg_p, get_default_def_for (mem_var));
+
+ return false;
+ }
+
+ /* Traverse all the symbols factored in PHI to see if we need to
+ unfactor it. If the argument corresponding to edge E has more
+ than one reaching definition, then PHI will need to be split to
+     accommodate the multiple reaching defs.  */
+ EXECUTE_IF_AND_IN_BITMAP (phi_syms, syms_to_rename, 0, i, bi)
+ {
+ rdef = get_reaching_def (referenced_var (i));
+ rdef_sym = SSA_NAME_VAR (rdef);
+ add_reached_sym (rdef, i);
+
+ /* Remember the first factored reaching definition we find. If
+ we need to unfactor PHI, the first factored reaching
+ definition will stay associated to PHI. If none of the
+ reaching definitions are factored names, then MEM's default
+ definition will be used. */
+ if (first_rdef == NULL_TREE && rdef_sym == mem_var)
+ first_rdef = rdef;
+
+ /* If RDEF is different from the previous one, and it's not the
+ name that we had decided to leave in the original PHI, add it
+ to the set of names that will require new PHI nodes. */
+ if ((last_rdef && rdef != last_rdef && rdef != first_rdef)
+ || rdef_sym != mem_var)
+ {
+ if (rdefs == NULL)
+ rdefs = BITMAP_ALLOC (NULL);
+ bitmap_set_bit (rdefs, SSA_NAME_VERSION (rdef));
+ }
+
+ last_rdef = rdef;
+ }
+
+ /* If we did not find any factored reaching definition, then use
+ .MEM's default definition as the argument. Otherwise, we would
+ be converting this factored PHI node into a non-factored PHI.
+ This will break use-def chains when a subset of symbols are
+ marked for renaming. If all the arguments of this PHI node end
+ up being MEM's default definition, then the PHI will be cleaned
+ up by DCE. */
+ if (first_rdef == NULL_TREE)
+ first_rdef = get_default_def_for (mem_var);
+
+ /* The argument corresponding to edge E is replaced with the first
+ reaching definition we found for PHI_SYMS. */
+ SET_USE (PHI_ARG_DEF_PTR_FROM_EDGE (phi, e), first_rdef);
+ BITMAP_FREE (get_ssa_name_ann (first_rdef)->reached_syms);
+
+ /* If we found multiple reaching definitions, we have to split PHI
+ accordingly. Register PHI in the list of unfactored PHI nodes so
+ that the children PHIs can be post-processed afterwards. */
+ if (rdefs)
+ {
+ split_factored_phi (phi, e, bb, phi_syms, rdefs);
+ BITMAP_FREE (rdefs);
+ return true;
+ }
+
+ return false;
+}
+
+
/* Visit all the successor blocks of BB looking for PHI nodes. For
every PHI node found, check if any of its arguments is in
OLD_SSA_NAMES. If so, and if the argument has a current reaching
@@ -2384,11 +2821,15 @@ rewrite_update_phi_arguments (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
continue;
phis = VEC_index (tree_vec, phis_to_rewrite, e->dest->index);
- for (i = 0; VEC_iterate (tree, phis, i, phi); i++)
+
+ /* Note that we cannot use VEC_iterate here because PHIS may
+ grow when calling replace_factored_phi_argument. */
+ for (i = 0; i < VEC_length (tree, phis); i++)
{
- tree arg;
+ tree arg, lhs_sym;
use_operand_p arg_p;
+ phi = VEC_index (tree, phis, i);
gcc_assert (REWRITE_THIS_STMT (phi));
arg_p = PHI_ARG_DEF_PTR_FROM_EDGE (phi, e);
@@ -2397,35 +2838,40 @@ rewrite_update_phi_arguments (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
if (arg && !DECL_P (arg) && TREE_CODE (arg) != SSA_NAME)
continue;
- if (arg == NULL_TREE)
+ lhs_sym = SSA_NAME_VAR (PHI_RESULT (phi));
+
+ if (arg && TREE_CODE (arg) == SSA_NAME && is_old_name (arg))
{
- /* When updating a PHI node for a recently introduced
- symbol we may find NULL arguments. That's why we
- take the symbol from the LHS of the PHI node. */
- replace_use (arg_p, SSA_NAME_VAR (PHI_RESULT (phi)));
+ /* Process old SSA names first. */
+ replace_use (arg_p, arg);
+ }
+ else if (lhs_sym == mem_var)
+ {
+ /* If this is a factored PHI node, the argument may
+ have multiple reaching definitions, which will
+ require this PHI node to be split up. */
+ replace_factored_phi_argument (phi, e, e->dest);
+
+ /* PHIS may grow, so we need to reload it. */
+ phis = VEC_index (tree_vec, phis_to_rewrite, e->dest->index);
}
else
{
- tree sym, lhs, arg_sym, lhs_sym;
-
- lhs = PHI_RESULT (phi);
- lhs_sym = SSA_NAME_VAR (lhs);
- arg_sym = DECL_P (arg) ? arg : SSA_NAME_VAR (arg);
-
- /* Make sure we use the right symbol when updating PHIs
- for memory symbols. When either the LHS of a PHI or
- the argument is a memory symbol, then use the LHS of
- the PHI as that always contains the right symbol
- being defined by this PHI. */
- if (!is_gimple_reg (lhs_sym) || !is_gimple_reg (arg_sym))
- sym = lhs_sym;
+ tree arg_sym;
+
+ /* When updating a PHI node for a recently introduced
+ symbol we will find NULL arguments. That's why we
+ may need to take the symbol from the LHS of the PHI
+ node. */
+ if (arg == NULL_TREE || SSA_NAME_VAR (arg) == mem_var)
+ arg_sym = lhs_sym;
+ else if (DECL_P (arg))
+ arg_sym = arg;
else
- sym = arg_sym;
+ arg_sym = SSA_NAME_VAR (arg);
- if (symbol_marked_for_renaming (sym))
- replace_use (arg_p, sym);
- else if (is_old_name (arg))
- replace_use (arg_p, arg);
+ if (symbol_marked_for_renaming (arg_sym))
+ replace_use (arg_p, arg_sym);
}
if (e->flags & EDGE_ABNORMAL)
@@ -2435,6 +2881,28 @@ rewrite_update_phi_arguments (struct dom_walk_data *walk_data ATTRIBUTE_UNUSED,
}
+/* Dump unfactored PHI node PHI to FILE. */
+
+void
+dump_unfactored_phi (FILE *file, tree phi)
+{
+ unsigned j;
+ tree child_phi;
+ unfactored_phis_t n = lookup_unfactored_phi (phi);
+
+ if (n && n->children)
+ {
+ dump_loads_and_stores (file, n->phi);
+
+ fprintf (file, "\nChildren PHI nodes:\n");
+ for (j = 0; VEC_iterate (tree, n->children, j, child_phi); j++)
+ dump_loads_and_stores (file, child_phi);
+
+ fprintf (file, "\n");
+ }
+}
+
+
/* Rewrite the actual blocks, statements, and PHI arguments, to be in SSA
form.
@@ -2522,10 +2990,9 @@ static void
mark_def_sites_initialize_block (struct dom_walk_data *walk_data,
basic_block bb ATTRIBUTE_UNUSED)
{
- struct mark_def_sites_global_data *gd =
- (struct mark_def_sites_global_data *) walk_data->global_data;
- bitmap kills = gd->kills;
- bitmap_clear (kills);
+ struct mark_def_sites_global_data *gd;
+ gd = (struct mark_def_sites_global_data *) walk_data->global_data;
+ bitmap_clear (gd->kills);
}
@@ -2600,9 +3067,23 @@ init_ssa_renamer (void)
FOR_EACH_REFERENCED_VAR(var, rvi)
set_current_def (var, NULL_TREE);
- /* Allocate the table to map statements to the symbols they load/store. */
- gcc_assert (mem_syms_tbl == NULL);
- mem_syms_tbl = htab_create (200, mem_syms_hash, mem_syms_eq, mem_syms_free);
+ /* Allocate the array of bitmaps that hold the symbols that have
+ been stored in each basic block. This is used to compute what
+ symbols are associated with factored PHI nodes for .MEM. */
+ gcc_assert (syms_stored_in_bb == NULL);
+ syms_stored_in_bb = XCNEWVEC (bitmap, last_basic_block);
+
+ gcc_assert (syms_with_phi_in_bb == NULL);
+ syms_with_phi_in_bb = XCNEWVEC (bitmap, last_basic_block);
+
+ gcc_assert (syms_to_factor_in_bb == NULL);
+ syms_to_factor_in_bb = XCNEWVEC (bitmap, last_basic_block);
+
+ /* Dominance numbers are assigned to memory SSA names and are used
+ whenever factored PHI nodes have been split (see
+ fixup_unfactored_phis). Dominance numbering starts at 2.
+ Dominance number 1 is reserved for .MEM's default definition. */
+ last_dom_num = 2;
}
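
    [The dominance-number helpers used later during the update
    (set_next_dom_num, get_name_dom_num) are added elsewhere in this patch
    and are not shown in this section.  A minimal sketch of the idea,
    assuming a hypothetical DOM_NUM field on the SSA name annotation; the
    patch's actual helpers may store and assign the counter differently:

      /* Sketch only.  Give the memory name defined by STMT the next
	 dominance number.  LAST_DOM_NUM grows in dominator-walk order,
	 so a larger number means the definition is seen later along the
	 same dominator path.  */
      static void
      set_next_dom_num (tree stmt)
      {
	tree def = NULL_TREE;

	if (TREE_CODE (stmt) == PHI_NODE)
	  def = PHI_RESULT (stmt);
	else if (VDEF_OPS (stmt))
	  def = VDEF_RESULT (VDEF_OPS (stmt));

	if (def && TREE_CODE (def) == SSA_NAME)
	  /* DOM_NUM is a hypothetical annotation field.  */
	  get_ssa_name_ann (def)->dom_num = last_dom_num++;
      }

      /* Sketch only.  Dominance number 1 is reserved for .MEM's default
	 definition; every other memory name uses the number assigned
	 above.  */
      static unsigned int
      get_name_dom_num (tree name)
      {
	if (name == default_def (mem_var))
	  return 1;

	return get_ssa_name_ann (name)->dom_num;
      }
    ]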
@@ -2611,11 +3092,7 @@ init_ssa_renamer (void)
static void
fini_ssa_renamer (void)
{
- if (mem_syms_tbl)
- {
- htab_delete (mem_syms_tbl);
- mem_syms_tbl = NULL;
- }
+ basic_block bb;
if (def_blocks)
{
@@ -2623,6 +3100,30 @@ fini_ssa_renamer (void)
def_blocks = NULL;
}
+ if (syms_stored_in_bb)
+ {
+ FOR_EACH_BB (bb)
+ BITMAP_FREE (syms_stored_in_bb[bb->index]);
+ free (syms_stored_in_bb);
+ syms_stored_in_bb = NULL;
+ }
+
+ if (syms_with_phi_in_bb)
+ {
+ FOR_EACH_BB (bb)
+ BITMAP_FREE (syms_with_phi_in_bb[bb->index]);
+ free (syms_with_phi_in_bb);
+ syms_with_phi_in_bb = NULL;
+ }
+
+ if (syms_to_factor_in_bb)
+ {
+ FOR_EACH_BB (bb)
+ BITMAP_FREE (syms_to_factor_in_bb[bb->index]);
+ free (syms_to_factor_in_bb);
+ syms_to_factor_in_bb = NULL;
+ }
+
in_ssa_p = true;
}
@@ -2666,7 +3167,7 @@ rewrite_into_ssa (void)
sbitmap_zero (interesting_blocks);
/* Initialize dominance frontier. */
- dfs = (bitmap *) xmalloc (last_basic_block * sizeof (bitmap));
+ dfs = XNEWVEC (bitmap, last_basic_block);
FOR_EACH_BB (bb)
dfs[bb->index] = BITMAP_ALLOC (NULL);
@@ -2799,16 +3300,43 @@ prepare_block_for_update (basic_block bb, bool insert_phi_p)
the symbols that we are interested in. */
for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
{
- tree lhs_sym, lhs = PHI_RESULT (phi);
+ tree lhs = PHI_RESULT (phi);
+ tree lhs_sym = SSA_NAME_VAR (lhs);
- lhs_sym = DECL_P (lhs) ? lhs : SSA_NAME_VAR (lhs);
+ if (lhs_sym == mem_var)
+ {
+ bitmap stores = get_loads_and_stores (phi)->stores;
- /* Have to check all the symbols in the arg list of the PHI
- node. */
- if (symbol_marked_for_renaming (lhs_sym))
+ if (bitmap_intersect_p (stores, syms_to_rename))
+ {
+ bitmap_iterator bi;
+ unsigned i;
+
+ /* If symbols currently factored by PHI have been promoted
+ to registers, remove them from the set of factored
+ symbols. */
+ EXECUTE_IF_SET_IN_BITMAP (stores, 0, i, bi)
+ if (is_gimple_reg (referenced_var (i)))
+ bitmap_clear_bit (stores, i);
+
+ add_stored_syms (stores, bb->index);
+ add_syms_with_phi (stores, bb->index);
+ mark_use_interesting (mem_var, phi, bb, insert_phi_p);
+ mark_def_interesting (mem_var, phi, bb, insert_phi_p);
+ }
+ }
+ else if (symbol_marked_for_renaming (lhs_sym))
{
mark_use_interesting (lhs_sym, phi, bb, insert_phi_p);
mark_def_interesting (lhs_sym, phi, bb, insert_phi_p);
+
+ if (!is_gimple_reg (lhs_sym))
+ {
+ add_stored_sym (lhs_sym, bb->index);
+ add_sym_with_phi (lhs_sym, bb->index);
+ mark_use_interesting (mem_var, phi, bb, insert_phi_p);
+ mark_def_interesting (mem_var, phi, bb, insert_phi_p);
+ }
}
}
@@ -2838,39 +3366,38 @@ prepare_block_for_update (basic_block bb, bool insert_phi_p)
mark_def_interesting (def, stmt, bb, insert_phi_p);
}
+ /* If the statement makes memory references, mark this site as a
+ reference site for .MEM. At this point we are not interested
+ in the individual symbols loaded/stored by STMT. We are only
+ interested in computing global live-in information and PHI
+ placement for .MEM. We will refine what symbols need the PHI
+ node in a later pass. */
if (need_to_update_vops_p && stmt_references_memory_p (stmt))
{
- unsigned i;
- bitmap_iterator bi;
- mem_syms_map_t syms;
-
- syms = syms_referenced_by (stmt);
-
- if (syms->stored)
- EXECUTE_IF_SET_IN_BITMAP (syms->stored, 0, i, bi)
- {
- tree sym = referenced_var (i);
- if (symbol_marked_for_renaming (sym))
- {
- mark_use_interesting (sym, stmt, bb, insert_phi_p);
- mark_def_interesting (sym, stmt, bb, insert_phi_p);
- }
- }
-
- if (syms->loaded)
- EXECUTE_IF_SET_IN_BITMAP (syms->loaded, 0, i, bi)
- {
- tree sym = referenced_var (i);
- if (symbol_marked_for_renaming (sym))
- mark_use_interesting (sym, stmt, bb, insert_phi_p);
- }
+ mem_syms_map_t syms = get_loads_and_stores (stmt);
+
+ if (syms->stores
+ && bitmap_intersect_p (syms->stores, syms_to_rename))
+ {
+ mark_use_interesting (mem_var, stmt, bb, insert_phi_p);
+ mark_def_interesting (mem_var, stmt, bb, insert_phi_p);
+
+ /* Add all the stored symbols to SYMS_STORED_IN_BB[BB->INDEX].
+ This will be used to determine what symbols are
+ affected by factored PHI nodes for .MEM. */
+ add_stored_syms (syms->stores, bb->index);
+ }
+
+ if (syms->loads
+ && bitmap_intersect_p (syms->loads, syms_to_rename))
+ mark_use_interesting (mem_var, stmt, bb, insert_phi_p);
}
}
/* Now visit all the blocks dominated by BB. */
for (son = first_dom_son (CDI_DOMINATORS, bb);
- son;
- son = next_dom_son (CDI_DOMINATORS, son))
+ son;
+ son = next_dom_son (CDI_DOMINATORS, son))
prepare_block_for_update (son, insert_phi_p);
}
@@ -3066,11 +3593,7 @@ dump_update_ssa (FILE *file)
if (syms_to_rename && !bitmap_empty_p (syms_to_rename))
{
fprintf (file, "\n\nSymbols to be put in SSA form\n\n");
- EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
- {
- print_generic_expr (file, referenced_var (i), 0);
- fprintf (file, " ");
- }
+ dump_decl_set (file, syms_to_rename);
}
if (names_to_release && !bitmap_empty_p (names_to_release))
@@ -3082,7 +3605,7 @@ dump_update_ssa (FILE *file)
fprintf (file, " ");
}
}
-
+
if (stale_ssa_names && !bitmap_empty_p (stale_ssa_names))
{
fprintf (file, "\n\nSSA names marked stale\n\n");
@@ -3128,6 +3651,7 @@ init_update_ssa (void)
stale_ssa_names = NULL;
memset (&update_ssa_stats, 0, sizeof (update_ssa_stats));
update_ssa_stats.virtual_symbols = BITMAP_ALLOC (NULL);
+ gcc_assert (unfactored_phis == NULL);
}
@@ -3164,6 +3688,26 @@ delete_update_ssa (void)
clear_ssa_name_info ();
fini_ssa_renamer ();
+
+ if (unfactored_phis)
+ {
+ htab_delete (unfactored_phis);
+ unfactored_phis = NULL;
+ first_unfactored_phi = NULL;
+ last_unfactored_phi = NULL;
+ }
+
+ if (blocks_with_phis_to_rewrite)
+ EXECUTE_IF_SET_IN_BITMAP (blocks_with_phis_to_rewrite, 0, i, bi)
+ {
+ tree_vec phis = VEC_index (tree_vec, phis_to_rewrite, i);
+
+ VEC_free (tree, heap, phis);
+ VEC_replace (tree_vec, phis_to_rewrite, i, NULL);
+ }
+
+ BITMAP_FREE (blocks_with_phis_to_rewrite);
+ BITMAP_FREE (blocks_to_update);
}
@@ -3273,7 +3817,7 @@ mark_set_for_renaming (bitmap set)
gcc_assert (get_subvars_for_var (referenced_var (i)) == NULL);
#endif
- if (bitmap_empty_p (set))
+ if (set == NULL || bitmap_empty_p (set))
return;
if (need_to_initialize_update_ssa_p)
@@ -3398,7 +3942,7 @@ insert_updated_phi_nodes_for (tree var, bitmap *dfs, bitmap blocks,
if (TREE_CODE (var) == SSA_NAME)
gcc_assert (is_old_name (var));
else
- gcc_assert (symbol_marked_for_renaming (var));
+ gcc_assert (var == mem_var || symbol_marked_for_renaming (var));
#endif
/* Get all the definition sites for VAR. */
@@ -3408,8 +3952,11 @@ insert_updated_phi_nodes_for (tree var, bitmap *dfs, bitmap blocks,
if (db == NULL || bitmap_empty_p (db->def_blocks))
return;
- /* Compute the initial iterated dominance frontier. */
- idf = find_idf (db->def_blocks, dfs);
+ /* Compute the initial iterated dominance frontier. If VAR is .MEM,
+     computing the IDF will also fill in the array SYMS_TO_FACTOR_IN_BB
+ so that newly added PHI nodes can be filled in with the initial
+ set of symbols factored in them. */
+ idf = compute_idf (var, db->def_blocks, dfs);
pruned_idf = BITMAP_ALLOC (NULL);
if (TREE_CODE (var) == SSA_NAME)
@@ -3421,7 +3968,6 @@ insert_updated_phi_nodes_for (tree var, bitmap *dfs, bitmap blocks,
common dominator of all the definition blocks. */
entry = nearest_common_dominator_for_set (CDI_DOMINATORS,
db->def_blocks);
-
if (entry != ENTRY_BLOCK_PTR)
EXECUTE_IF_SET_IN_BITMAP (idf, 0, i, bi)
if (BASIC_BLOCK (i) != entry
@@ -3449,6 +3995,9 @@ insert_updated_phi_nodes_for (tree var, bitmap *dfs, bitmap blocks,
are included in the region to be updated. The feeding blocks
are important to guarantee that the PHI arguments are renamed
properly. */
+
+ /* FIXME, this is not needed if we are updating symbols. We are
+ already starting at the ENTRY block anyway. */
bitmap_ior_into (blocks, pruned_idf);
EXECUTE_IF_SET_IN_BITMAP (pruned_idf, 0, i, bi)
{
@@ -3536,28 +4085,42 @@ switch_virtuals_to_full_rewrite (void)
}
-/* While the SSA web is updated, statements that make memory stores
- may need to switch from a factored store into .MEM to a single
- store into the corresponding symbol and vice-versa. For instance,
- suppose that we have this structure assignment before gathering
- call-clobbered variables in the aliaser:
-
- # D.1528_9 = VDEF <.MEM_1(D)>
- D.1528 = g (); --> stores { D.1528 }
-
- Since D.1528 is a structure, we create a VDEF for it. And since we
- have not yet computed call-clobbered variables, we assume that the
- only symbol stored by the assignment is D.1528 itself.
-
- After computing call-clobbered variables, however, the store may
- get converted into a factored store:
-
- # .MEM_10 = VDEF <.MEM_1(D)>
- D.1528 = g (); --> stores { s D.1528 }
-
- But since we had never marked D.1528 for renaming (it was not
- necessary), the immediate uses of D.1528_9 do not get updated. So,
- we must update them at the end. */
+/* If there are any .MEM names that are marked to be released, we
+ need to replace their immediate uses with the default definition
+ for .MEM. Consider this
+
+ struct { ... } x;
+ if (i_12 > 10)
+ # .MEM_39 = VDEF <.MEM_4(D)>
+ x = y;
+ else
+ # .MEM_15 = VDEF <.MEM_4(D)>
+ x = z;
+ endif
+ # .MEM_59 = PHI <.MEM_15, .MEM_39>
+
+ After scalarization
+
+ struct { ... } x;
+ if (i_12 > 10)
+ x$a_40 = y$a_39;
+ x$b_41 = y$b_38;
+ else
+ x$a_45 = y$a_35;
+ x$b_46 = y$b_34;
+ endif
+ # .MEM_59 = PHI <.MEM_15, .MEM_39>
+ # x$a_60 = PHI <x$a_40, x$a_45>
+ # x$b_61 = PHI <x$b_41, x$b_46>
+
+ Both .MEM_15 and .MEM_39 have disappeared and have been marked
+ for removal. But since .MEM is not a symbol that can be marked
+ for renaming, the PHI node for it remains in place. Moreover,
+ because 'x' has been scalarized, there will be no uses of .MEM_59
+ downstream. However, the SSA verifier will see uses of .MEM_15
+ and .MEM_39 and trigger an ICE. By replacing both of them with
+ .MEM's default definition, we placate the verifier and maintain
+ the removability of this PHI node. */
static void
replace_stale_ssa_names (void)
@@ -3566,42 +4129,6 @@ replace_stale_ssa_names (void)
bitmap_iterator bi;
tree new_name;
- /* If there are any .MEM names that are marked to be released, we
- need to replace their immediate uses with the default definition
- for .MEM. Consider this
-
- struct { ... } x;
- if (i_12 > 10)
- # .MEM_39 = VDEF <.MEM_4(D)>
- x = y;
- else
- # .MEM_15 = VDEF <.MEM_4(D)>
- x = z;
- endif
- # .MEM_59 = PHI <.MEM_15, .MEM_39>
-
- After scalarization
-
- struct { ... } x;
- if (i_12 > 10)
- x$a_40 = y$a_39;
- x$b_41 = y$b_38;
- else
- x$a_45 = y$a_35;
- x$b_46 = y$b_34;
- endif
- # .MEM_59 = PHI <.MEM_15, .MEM_39>
- # x$a_60 = PHI <x$a_40, x$a_45>
- # x$b_61 = PHI <x$b_41, x$b_46>
-
- Both .MEM_15 and .MEM_39 have disappeared and have been marked
- for removal. But since .MEM is not a symbol that can be marked
- for renaming, the PHI node for it remains in place. Moreover,
- because 'x' has been scalarized, there will be no uses of .MEM_59
- downstream. However, the SSA verifier will see uses of .MEM_15
- and .MEM_39 and trigger an ICE. By replacing both of them with
- .MEM's default definition, we placate the verifier and maintain
- the removability of this PHI node. */
if (names_to_release)
{
new_name = get_default_def_for (mem_var);
@@ -3657,6 +4184,289 @@ replace_stale_ssa_names (void)
}
+
+/* Helper for fixup_unfactored_phis. Add all the immediate uses for
+ SSA name PHI_LHS to *PHI_QUEUE_P or *STMT_QUEUE_P accordingly.
+ STMTS_ADDED is the set of statements that have already been added
+ to one of the queues. */
+
+static void
+add_to_fixup_queues (tree phi_lhs,
+ VEC(tree, heap) **phi_queue_p,
+ VEC(tree, heap) **stmt_queue_p,
+ htab_t stmts_added)
+{
+ imm_use_iterator imm_iter;
+ void **slot;
+ tree stmt;
+
+ FOR_EACH_IMM_USE_STMT (stmt, imm_iter, phi_lhs)
+ {
+ slot = htab_find_slot (stmts_added, stmt, INSERT);
+ if (*slot == NULL)
+ {
+ if (TREE_CODE (stmt) == PHI_NODE)
+ VEC_safe_push (tree, heap, *phi_queue_p, stmt);
+ else
+ VEC_safe_push (tree, heap, *stmt_queue_p, stmt);
+
+ *slot = stmt;
+ }
+ }
+}
+
+
+/* Helper for fixup_unfactored_phis. Set CURRDEF for all the symbols
+ factored in NAME's defining statement. If NAME is created by an
+ unfactored PHI node, recursively inspect its children. */
+
+static void
+compute_currdefs_for (tree name)
+{
+ bitmap syms;
+ bitmap_iterator bi;
+ unsigned i;
+ unfactored_phis_t n;
+ tree sym, child, stmt;
+
+ /* The default definition for .MEM is a catchall name that only
+ reaches symbols that have not been defined otherwise. */
+ if (name == default_def (mem_var))
+ return;
+
+  /* The name for a regular memory symbol only reaches that symbol.  */
+ sym = SSA_NAME_VAR (name);
+ if (sym != mem_var)
+ {
+ bitmap_set_bit (syms_to_rename, DECL_UID (sym));
+ set_current_def (sym, name);
+ return;
+ }
+
+ /* Otherwise, get all the symbols associated to this .MEM name. */
+ stmt = SSA_NAME_DEF_STMT (name);
+ syms = get_loads_and_stores (stmt)->stores;
+ bitmap_ior_into (syms_to_rename, syms);
+ EXECUTE_IF_SET_IN_BITMAP (syms, 0, i, bi)
+ set_current_def (referenced_var (i), name);
+
+ /* If the defining statement is an unfactored PHI node, examine its
+ children PHI nodes. */
+ if ((n = lookup_unfactored_phi (stmt)) != NULL)
+ for (i = 0; VEC_iterate (tree, n->children, i, child); i++)
+ compute_currdefs_for (PHI_RESULT (child));
+}
+
+
+/* For every unfactored PHI node P, process every immediate use
+ through the renamer to account for the unfactoring. Given
+
+ .MEM_10 = PHI <.MEM_3, ???> { a, b, c, d }
+
+ Suppose that the second argument of .MEM_10 is reached by three
+ different names: .MEM_5, c_8 and d_7. This would have caused the
+ following splitting of .MEM_10:
+
+ .MEM_10 = PHI <.MEM_3, .MEM_5> { a, b }
+ c_11 = PHI <.MEM_3, c_8> { c }
+ d_12 = PHI <.MEM_3, d_7> { d }
+
+ Now, suppose that one of .MEM_10's immediate uses is the statement:
+
+ x_32 = *p_1 { a, b, c, d }
+
+ If x_32 was renamed *before* .MEM_10 was split, the renamer would
+ have created:
+
+ # VUSE <.MEM_10>
+ x_32 = *p_1
+
+ But given the subsequent split, this is wrong because .MEM_10 only
+ factors symbols { a, b }. Therefore, we traverse all the immediate
+ uses for unfactored PHI nodes and pass them through the renamer one
+ more time. This way, x_32 can be renamed to:
+
+ # VUSE <.MEM_10, c_11, d_12>
+ x_32 = *p_1
+
+ Note that this process is only ever done for PHI nodes whose
+ immediate uses were renamed *before* the PHI node was split. If we
+ had managed to split the PHI node before renaming all its immediate
+ uses, we wouldn't need this post-processing.
+
+ Notice that, in general, it is not possible to guarantee the order
+ in which basic blocks will be renamed. When doing the dominator
+ walk, we will sometimes visit the children of a given basic block
+ before the block itself. This is particularly true in the case of
+ loops. */
+
+static void
+fixup_unfactored_phis (void)
+{
+ unfactored_phis_t n;
+ htab_t stmts_added;
+ unsigned stmt_ix;
+ VEC(tree, heap) *stmt_queue = NULL;
+ VEC(tree, heap) *phi_queue = NULL;
+
+ timevar_push (TV_TREE_SSA_FIX_UNFACTORED_UD);
+
+ /* Add immediate uses for every unfactored PHI node to STMT_QUEUE or
+ PHI_QUEUE accordingly. */
+ stmts_added = htab_create (50, htab_hash_pointer, htab_eq_pointer, NULL);
+
+ for (n = first_unfactored_phi; n; n = n->next)
+ add_to_fixup_queues (PHI_RESULT (n->phi), &phi_queue, &stmt_queue,
+ stmts_added);
+
+ /* PHI nodes in PHI_QUEUE may need to be split, and they may also
+ cause more PHI nodes to be split in turn. */
+ for (stmt_ix = 0; stmt_ix < VEC_length (tree, phi_queue); stmt_ix++)
+ {
+ int j;
+ tree phi = VEC_index (tree, phi_queue, stmt_ix);
+ tree phi_lhs = PHI_RESULT (phi);
+ bool split_p = false;
+
+ /* One or more arguments of PHI will be an unfactored PHI
+ node. Compute CURRDEF for all the symbols stored by that
+ argument (and its children PHI nodes), and rewrite
+ PHI's argument. */
+ for (j = 0; j < PHI_NUM_ARGS (phi); j++)
+ {
+ tree arg, arg_phi;
+ use_operand_p arg_p;
+ edge e;
+
+ e = PHI_ARG_EDGE (phi, j);
+ arg = PHI_ARG_DEF (phi, j);
+ arg_p = PHI_ARG_DEF_PTR (phi, j);
+
+ /* Ignore self-referential arguments. */
+ if (arg == phi_lhs)
+ continue;
+
+ arg_phi = SSA_NAME_DEF_STMT (arg);
+ if (TREE_CODE (arg_phi) == PHI_NODE
+ && lookup_unfactored_phi (arg_phi))
+ {
+ /* If ARG is an unfactored PHI, its set of factored
+ symbols may have changed after this argument was
+ added by the renamer. We need to recompute the
+ reaching definitions for all the symbols factored in
+ PHI and see if that causes PHI to be unfactored. */
+ tree sym;
+
+ bitmap_clear (syms_to_rename);
+ compute_currdefs_for (arg);
+ sym = SSA_NAME_VAR (PHI_RESULT (phi));
+ if (sym == mem_var)
+ split_p |= replace_factored_phi_argument (phi, e, e->dest);
+ else
+ replace_use (arg_p, sym);
+ }
+ }
+
+ /* If we had to split PHI while examining its arguments, add
+ PHI's immediate uses to the fixup queues. */
+ if (split_p)
+ add_to_fixup_queues (PHI_RESULT (phi), &phi_queue, &stmt_queue,
+ stmts_added);
+
+ /* Allow PHI to be added to the fixup queues again. In the case
+ of loops, two or more PHI nodes could be in a dependency
+ cycle. Each will need to be visited twice before the
+ splitting stabilizes. FIXME, prove. */
+ htab_remove_elt (stmts_added, (PTR) phi);
+ }
+
+ /* Once all the PHI nodes have been split, rewrite the operands of
+ every affected statement. */
+ for (stmt_ix = 0; stmt_ix < VEC_length (tree, stmt_queue); stmt_ix++)
+ {
+ ssa_op_iter iter;
+ tree stmt, use, *sorted_names;
+ bool rewrite_p;
+ int i, last, num_vops;
+
+ bitmap_clear (syms_to_rename);
+
+ stmt = VEC_index (tree, stmt_queue, stmt_ix);
+
+ /* Sort VOPS in dominance numbering order. This way, we
+ guarantee that CURRDEFs will be computed in the right order.
+ Suppose that STMT was originally:
+
+ # VUSE <.MEM_14, .MEM_16>
+ x_1 = *p_3
+
+ and both MEM_14 and MEM_16 are factored PHI nodes that were
+ split after STMT had been renamed. We now need to replace
+ MEM_14 and MEM_16 with their respective children PHI nodes,
+ but since both names are factoring the same symbols, we have
+ to process them in dominator order.
+
+	 This is where the dominance number is used.  (1) Since both
+ MEM_14 and MEM_16 reach STMT, we know that they must be on
+ the same dominance sub-tree (otherwise, they would have both
+ been merged into a PHI node), (2) so, if the dominance number
+ of MEM_14 is greater than the dominance number of MEM_16, it
+ means that MEM_14 post-dominates MEM_16.
+
+ Therefore, all the definitions made by MEM_14 occur *after*
+ those made by MEM_16. So, before computing current
+ definitions, we sort the names by ascending dominance number.
+ This way, symbols will be assigned a CURRDEF in the correct
+ dominator ordering. */
+ num_vops = num_ssa_operands (stmt, SSA_OP_VIRTUAL_USES);
+ sorted_names = XCNEWVEC (tree, num_vops);
+ last = -1;
+ FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter, SSA_OP_VIRTUAL_USES)
+ {
+ unsigned int dn = get_name_dom_num (use);
+ gcc_assert (dn > 0);
+
+ i = last;
+ while (i >= 0 && get_name_dom_num (sorted_names[i]) > dn)
+ {
+ sorted_names[i + 1] = sorted_names[i];
+ i--;
+ }
+ sorted_names[i + 1] = use;
+ last++;
+ }
+
+ /* Now traverse the sorted list computing CURRDEFs for all the
+ reaching names. */
+ rewrite_p = false;
+ for (i = 0; i < num_vops; i++)
+ {
+ tree use = sorted_names[i];
+ tree def_stmt = SSA_NAME_DEF_STMT (use);
+
+ compute_currdefs_for (use);
+
+ /* We only need to rewrite STMT's operands if DEF_STMT is an
+ unfactored PHI node. */
+ if (TREE_CODE (def_stmt) == PHI_NODE
+ && lookup_unfactored_phi (def_stmt))
+ rewrite_p = true;
+ }
+
+ free (sorted_names);
+
+ if (rewrite_p)
+ rewrite_update_memory_stmt (stmt, false);
+ }
+
+ VEC_free (tree, heap, phi_queue);
+ VEC_free (tree, heap, stmt_queue);
+ htab_delete (stmts_added);
+
+ timevar_pop (TV_TREE_SSA_FIX_UNFACTORED_UD);
+}
+
+
/* Given a set of newly created SSA names (NEW_SSA_NAMES) and a set of
existing SSA names (OLD_SSA_NAMES), update the SSA form so that:
@@ -3812,10 +4622,10 @@ update_ssa (unsigned update_flags)
updating. For now this seems more work than it's worth. */
start_bb = ENTRY_BLOCK_PTR;
- /* Traverse the CFG looking for definitions and uses of symbols
- in SYMS_TO_RENAME. Mark interesting blocks and statements
- and set local live-in information for the PHI placement
- heuristics. */
+ /* Traverse the CFG looking for existing definitions and uses of
+ symbols in SYMS_TO_RENAME. Mark interesting blocks and
+ statements and set local live-in information for the PHI
+ placement heuristics. */
prepare_block_for_update (start_bb, insert_phi_p);
}
else
@@ -3857,9 +4667,22 @@ update_ssa (unsigned update_flags)
sbitmap_free (tmp);
}
+ /* When updating virtual operands, insert PHI nodes for .MEM.
+ If needed, they will be split into individual symbol PHI
+ nodes during renaming. */
+ if (need_to_update_vops_p)
+ insert_updated_phi_nodes_for (mem_var, dfs, blocks_to_update,
+ update_flags);
+
EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
- insert_updated_phi_nodes_for (referenced_var (i), dfs,
- blocks_to_update, update_flags);
+ {
+ /* We don't need to process virtual symbols here, as they
+ have been all handled by the .MEM PHI nodes above. */
+ tree sym = referenced_var (i);
+ if (is_gimple_reg (sym))
+ insert_updated_phi_nodes_for (referenced_var (i), dfs,
+ blocks_to_update, update_flags);
+ }
FOR_EACH_BB (bb)
BITMAP_FREE (dfs[bb->index]);
@@ -3889,13 +4712,18 @@ update_ssa (unsigned update_flags)
rewrite_blocks (start_bb, REWRITE_UPDATE, tmp);
+ sbitmap_free (tmp);
+
/* If the update process generated stale SSA names, their immediate
uses need to be replaced with the new name that was created in
their stead. */
- if (stale_ssa_names || names_to_release)
+ if (names_to_release || stale_ssa_names)
replace_stale_ssa_names ();
- sbitmap_free (tmp);
+ /* If the renamer had to split factored PHI nodes, we need to adjust
+ the immediate uses for the split PHI nodes. */
+ if (unfactored_phis)
+ fixup_unfactored_phis ();
/* Debugging dumps. */
if (dump_file)
@@ -3929,15 +4757,6 @@ update_ssa (unsigned update_flags)
/* Free allocated memory. */
done:
in_ssa_p = true;
- EXECUTE_IF_SET_IN_BITMAP (blocks_with_phis_to_rewrite, 0, i, bi)
- {
- tree_vec phis = VEC_index (tree_vec, phis_to_rewrite, i);
-
- VEC_free (tree, heap, phis);
- VEC_replace (tree_vec, phis_to_rewrite, i, NULL);
- }
- BITMAP_FREE (blocks_with_phis_to_rewrite);
- BITMAP_FREE (blocks_to_update);
delete_update_ssa ();
timevar_pop (TV_TREE_SSA_INCREMENTAL);
diff --git a/gcc/tree-nrv.c b/gcc/tree-nrv.c
index 0238a004423..301484e9f0e 100644
--- a/gcc/tree-nrv.c
+++ b/gcc/tree-nrv.c
@@ -263,7 +263,7 @@ execute_return_slot_opt (void)
&& stmt_references_memory_p (stmt)
&& aggregate_value_p (call, call))
{
- bitmap loads, stores;
+ mem_syms_map_t mp;
unsigned i;
bitmap_iterator bi;
@@ -271,12 +271,11 @@ execute_return_slot_opt (void)
asking whether it is call clobbered. When the LHS isn't a
simple decl, we need to check the VDEFs, so it's simplest
to just loop through all the stored symbols. */
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, loads, stores);
- EXECUTE_IF_SET_IN_BITMAP (stores, 0, i, bi)
- if (is_call_clobbered (referenced_var (i)))
- goto unsafe;
+ mp = get_loads_and_stores (stmt);
+ if (mp->stores)
+ EXECUTE_IF_SET_IN_BITMAP (mp->stores, 0, i, bi)
+ if (is_call_clobbered (referenced_var (i)))
+ goto unsafe;
/* No defs are call clobbered, so the optimization is safe. */
CALL_EXPR_RETURN_SLOT_OPT (call) = 1;
@@ -284,9 +283,7 @@ execute_return_slot_opt (void)
/* This is too late to mark the target addressable like we do
in gimplify_modify_expr_rhs, but that's OK; anything that
wasn't already addressable was handled there. */
- unsafe:
- BITMAP_FREE (stores);
- BITMAP_FREE (loads);
+ unsafe: ;
}
}
}
diff --git a/gcc/tree-outof-ssa.c b/gcc/tree-outof-ssa.c
index de961b38e2f..ba0e7c03a75 100644
--- a/gcc/tree-outof-ssa.c
+++ b/gcc/tree-outof-ssa.c
@@ -1119,7 +1119,7 @@ eliminate_virtual_phis (void)
}
}
#endif
- remove_phi_node (phi, NULL_TREE);
+ remove_phi_node (phi, NULL_TREE, true);
}
}
}
@@ -2394,13 +2394,13 @@ remove_ssa_form (var_map map, int flags)
if (values)
free (values);
- /* Remove phi nodes which have been translated back to real variables. */
+ /* Remove PHI nodes which have been translated back to real variables. */
FOR_EACH_BB (bb)
{
for (phi = phi_nodes (bb); phi; phi = next)
{
next = PHI_CHAIN (phi);
- remove_phi_node (phi, NULL_TREE);
+ remove_phi_node (phi, NULL_TREE, true);
}
}
diff --git a/gcc/tree-pass.h b/gcc/tree-pass.h
index 0664c7eb841..e96ed8da9f2 100644
--- a/gcc/tree-pass.h
+++ b/gcc/tree-pass.h
@@ -68,6 +68,8 @@ enum tree_dump_index
#define TDF_STMTADDR (1 << 12) /* Address of stmt. */
#define TDF_GRAPH (1 << 13) /* a graph dump is being emitted */
+#define TDF_MEMSYMS (1 << 14) /* display memory symbols in expr.
+ Implies TDF_VOPS. */
extern char *get_dump_file_name (enum tree_dump_index);
extern int dump_enabled_p (enum tree_dump_index);
diff --git a/gcc/tree-phinodes.c b/gcc/tree-phinodes.c
index 98b011f89e6..7dcda8f206f 100644
--- a/gcc/tree-phinodes.c
+++ b/gcc/tree-phinodes.c
@@ -233,6 +233,7 @@ make_phi_node (tree var, int len)
imm->next = NULL;
imm->stmt = phi;
}
+
return phi;
}
@@ -252,6 +253,8 @@ release_phi_node (tree phi)
delink_imm_use (imm);
}
+ delete_loads_and_stores (phi);
+
bucket = len > NUM_BUCKETS - 1 ? NUM_BUCKETS - 1 : len;
bucket -= 2;
PHI_CHAIN (phi) = free_phinodes[bucket];
@@ -301,6 +304,8 @@ resize_phi_node (tree *phi, int len)
imm->stmt = new_phi;
}
+ if (stmt_references_memory_p (*phi))
+ move_loads_and_stores (new_phi, *phi);
*phi = new_phi;
}
@@ -343,10 +348,11 @@ reserve_phi_args_for_new_edge (basic_block bb)
}
}
+
/* Create a new PHI node for variable VAR at basic block BB. */
-tree
-create_phi_node (tree var, basic_block bb)
+static tree
+create_phi_node_1 (tree var, basic_block bb)
{
tree phi;
@@ -362,6 +368,42 @@ create_phi_node (tree var, basic_block bb)
return phi;
}
+
+/* Create a new PHI node for variable VAR at basic block BB. */
+
+tree
+create_phi_node (tree var, basic_block bb)
+{
+#if defined ENABLE_CHECKING
+ /* Factored PHI nodes should be created using create_factored_phi_node. */
+ tree sym = (TREE_CODE (var) == SSA_NAME) ? SSA_NAME_VAR (var) : var;
+ gcc_assert (sym != mem_var);
+#endif
+
+ return create_phi_node_1 (var, bb);
+}
+
+
+/* Create a factored PHI node at basic block BB. Associate it with
+ the set of symbols SYMS. VAR must be either .MEM or an SSA name
+ for .MEM. */
+
+tree
+create_factored_phi_node (tree var, basic_block bb, bitmap syms)
+{
+ tree phi;
+
+ gcc_assert (var == mem_var || SSA_NAME_VAR (var) == mem_var);
+
+ phi = create_phi_node_1 (var, bb);
+
+ if (syms)
+ add_loads_and_stores (phi, syms, syms);
+
+ return phi;
+}
+
+
/* Add a new argument to PHI node PHI. DEF is the incoming reaching
definition and E is the edge through which DEF reaches PHI. The new
argument is added at the end of the argument list.
@@ -394,6 +436,7 @@ add_phi_arg (tree phi, tree def, edge e)
SET_PHI_ARG_DEF (phi, e->dest_idx, def);
}
+
/* Remove the Ith argument from PHI's argument list. This routine
implements removal by swapping the last alternative with the
alternative we want to delete and then shrinking the vector, which
@@ -406,7 +449,6 @@ remove_phi_arg_num (tree phi, int i)
gcc_assert (i < num_elem);
-
/* Delink the item which is being removed. */
delink_imm_use (&(PHI_ARG_IMM_USE_NODE (phi, i)));
@@ -428,6 +470,7 @@ remove_phi_arg_num (tree phi, int i)
PHI_NUM_ARGS (phi)--;
}
+
/* Remove all PHI arguments associated with edge E. */
void
@@ -439,11 +482,14 @@ remove_phi_args (edge e)
remove_phi_arg_num (phi, e->dest_idx);
}
+
/* Remove PHI node PHI from basic block BB. If PREV is non-NULL, it is
- used as the node immediately before PHI in the linked list. */
+ used as the node immediately before PHI in the linked list. If
+ RELEASE_LHS_P is true, the LHS of this PHI node is released into
+ the free pool of SSA names. */
void
-remove_phi_node (tree phi, tree prev)
+remove_phi_node (tree phi, tree prev, bool release_lhs_p)
{
tree *loc;
@@ -465,7 +511,8 @@ remove_phi_node (tree phi, tree prev)
/* If we are deleting the PHI node, then we should release the
SSA_NAME node so that it can be reused. */
release_phi_node (phi);
- release_ssa_name (PHI_RESULT (phi));
+ if (release_lhs_p)
+ release_ssa_name (PHI_RESULT (phi));
}
diff --git a/gcc/tree-pretty-print.c b/gcc/tree-pretty-print.c
index 1f42b2a6c80..c843af6890e 100644
--- a/gcc/tree-pretty-print.c
+++ b/gcc/tree-pretty-print.c
@@ -87,14 +87,14 @@ do_niy (pretty_printer *buffer, tree node)
void
debug_generic_expr (tree t)
{
- print_generic_expr (stderr, t, TDF_VOPS|TDF_UID);
+ print_generic_expr (stderr, t, TDF_VOPS|TDF_MEMSYMS);
fprintf (stderr, "\n");
}
void
debug_generic_stmt (tree t)
{
- print_generic_stmt (stderr, t, TDF_VOPS|TDF_UID);
+ print_generic_stmt (stderr, t, TDF_VOPS|TDF_MEMSYMS);
fprintf (stderr, "\n");
}
@@ -103,7 +103,7 @@ debug_tree_chain (tree t)
{
while (t)
{
- print_generic_expr (stderr, t, TDF_VOPS|TDF_UID);
+ print_generic_expr (stderr, t, TDF_VOPS|TDF_MEMSYMS|TDF_UID);
fprintf(stderr, " ");
t = TREE_CHAIN (t);
}
@@ -402,6 +402,33 @@ dump_omp_clauses (pretty_printer *buffer, tree clause, int spc, int flags)
}
+/* Dump the set of decls SYMS. BUFFER, SPC and FLAGS are as in
+ dump_generic_node. */
+
+static void
+dump_symbols (pretty_printer *buffer, bitmap syms, int flags)
+{
+ unsigned i;
+ bitmap_iterator bi;
+
+ if (syms == NULL)
+ pp_string (buffer, "NIL");
+ else
+ {
+ pp_string (buffer, " { ");
+
+ EXECUTE_IF_SET_IN_BITMAP (syms, 0, i, bi)
+ {
+ tree sym = referenced_var_lookup (i);
+ dump_generic_node (buffer, sym, 0, flags, false);
+ pp_string (buffer, " ");
+ }
+
+ pp_string (buffer, "}");
+ }
+}
+
+
/* Dump the node NODE on the pretty_printer BUFFER, SPC spaces of indent.
FLAGS specifies details to show in the dump (see TDF_* in tree.h). If
IS_STMT is true, the object printed is considered to be a statement
@@ -423,7 +450,7 @@ dump_generic_node (pretty_printer *buffer, tree node, int spc, int flags,
if (TREE_CODE (node) != ERROR_MARK
&& is_gimple_stmt (node)
- && (flags & TDF_VOPS)
+ && (flags & (TDF_VOPS|TDF_MEMSYMS))
&& stmt_ann (node)
&& TREE_CODE (node) != PHI_NODE)
dump_vops (buffer, node, spc, flags);
@@ -1624,6 +1651,9 @@ dump_generic_node (pretty_printer *buffer, tree node, int spc, int flags,
pp_string (buffer, ", ");
}
pp_string (buffer, ">;");
+
+ if (stmt_references_memory_p (node) && (flags & TDF_MEMSYMS))
+ dump_symbols (buffer, get_loads_and_stores (node)->stores, flags);
}
break;
@@ -2572,18 +2602,22 @@ newline_and_indent (pretty_printer *buffer, int spc)
INDENT (spc);
}
+
static void
dump_vops (pretty_printer *buffer, tree stmt, int spc, int flags)
{
struct vdef_optype_d *vdefs;
struct vuse_optype_d *vuses;
size_t i, n;
+ mem_syms_map_t syms;
- if (!ssa_operands_active ())
+ if (!ssa_operands_active () || !stmt_references_memory_p (stmt))
return;
+ syms = (flags & TDF_MEMSYMS) ? get_loads_and_stores (stmt) : NULL;
+
vuses = VUSE_OPS (stmt);
- while (vuses)
+ if (vuses)
{
pp_string (buffer, "# VUSE <");
@@ -2596,12 +2630,16 @@ dump_vops (pretty_printer *buffer, tree stmt, int spc, int flags)
}
pp_string (buffer, ">");
- newline_and_indent (buffer, spc);
vuses = vuses->next;
+
+ if (syms)
+ dump_symbols (buffer, syms->loads, flags);
+
+ newline_and_indent (buffer, spc);
}
vdefs = VDEF_OPS (stmt);
- while (vdefs)
+ if (vdefs)
{
pp_string (buffer, "# ");
dump_generic_node (buffer, VDEF_RESULT (vdefs), spc + 2, flags, false);
@@ -2616,8 +2654,12 @@ dump_vops (pretty_printer *buffer, tree stmt, int spc, int flags)
}
pp_string (buffer, ">");
- newline_and_indent (buffer, spc);
vdefs = vdefs->next;
+
+ if (syms)
+ dump_symbols (buffer, syms->stores, flags);
+
+ newline_and_indent (buffer, spc);
}
}
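
    [Putting the dump_vops and dump_symbols changes together, a dump taken
    with TDF_MEMSYMS would show the loaded/stored symbol sets next to each
    virtual operator and factored PHI node, roughly like this (illustrative
    output only; the statements and symbol names are made up):

      # .MEM_10 = VDEF <.MEM_3> { a b c }
      *p_8 = 0;

      # VUSE <.MEM_10> { a }
      x_4 = a;

      # .MEM_12 = PHI <.MEM_10(3), .MEM_11(4)>; { a b c }
    ]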
diff --git a/gcc/tree-scalar-evolution.c b/gcc/tree-scalar-evolution.c
index 52b0ba38b67..ca1b243e2d7 100644
--- a/gcc/tree-scalar-evolution.c
+++ b/gcc/tree-scalar-evolution.c
@@ -2741,8 +2741,9 @@ scev_const_prop (void)
}
}
- /* Remove the ssa names that were replaced by constants. We do not remove them
- directly in the previous cycle, since this invalidates scev cache. */
+ /* Remove the ssa names that were replaced by constants. We do not
+ remove them directly in the previous cycle, since this
+ invalidates scev cache. */
if (ssa_names_to_remove)
{
bitmap_iterator bi;
@@ -2754,7 +2755,7 @@ scev_const_prop (void)
phi = SSA_NAME_DEF_STMT (name);
gcc_assert (TREE_CODE (phi) == PHI_NODE);
- remove_phi_node (phi, NULL);
+ remove_phi_node (phi, NULL, true);
}
BITMAP_FREE (ssa_names_to_remove);
@@ -2815,11 +2816,10 @@ scev_const_prop (void)
|| contains_abnormal_ssa_name_p (def))
continue;
- /* Eliminate the phi node and replace it by a computation outside
+ /* Eliminate the PHI node and replace it by a computation outside
the loop. */
def = unshare_expr (def);
- SET_PHI_RESULT (phi, NULL_TREE);
- remove_phi_node (phi, NULL_TREE);
+ remove_phi_node (phi, NULL_TREE, false);
ass = build2 (MODIFY_EXPR, void_type_node, rslt, NULL_TREE);
SSA_NAME_DEF_STMT (rslt) = ass;
diff --git a/gcc/tree-ssa-alias.c b/gcc/tree-ssa-alias.c
index 8c92dc1dc26..a06342ce54b 100644
--- a/gcc/tree-ssa-alias.c
+++ b/gcc/tree-ssa-alias.c
@@ -513,7 +513,6 @@ compute_may_aliases (void)
avoid invalid transformations on them. */
maybe_create_global_var (ai);
-
/* If the program contains ref-all pointers, finalize may-alias information
for them. This pass needs to be run after call-clobbering information
has been computed. */
@@ -762,6 +761,15 @@ init_alias_info (void)
}
}
}
+ else
+ {
+ /* If this is the first time we compute aliasing information,
+ every non-register symbol will need to be put into SSA form
+ (the initial SSA form only operates on GIMPLE registers). */
+ FOR_EACH_REFERENCED_VAR (var, rvi)
+ if (!is_gimple_reg (var))
+ mark_sym_for_renaming (var);
+ }
/* Next time, we will need to reset alias information. */
aliases_computed_p = true;
@@ -1287,11 +1295,12 @@ setup_pointers_and_addressables (struct alias_info *ai)
/* Global variables and addressable locals may be aliased. Create an
entry in ADDRESSABLE_VARS for VAR. */
- if (may_be_aliased (var)
- && (!var_can_have_subvars (var)
- || get_subvars_for_var (var) == NULL))
+ if (may_be_aliased (var))
{
- create_alias_map_for (var, ai);
+ if (!var_can_have_subvars (var)
+ || get_subvars_for_var (var) == NULL)
+ create_alias_map_for (var, ai);
+
mark_sym_for_renaming (var);
}
@@ -1354,6 +1363,7 @@ setup_pointers_and_addressables (struct alias_info *ai)
}
}
}
+
VEC_free (tree, heap, varvec);
}
@@ -2018,16 +2028,8 @@ dump_points_to_info_for (FILE *file, tree ptr)
if (pi->pt_vars)
{
- unsigned ix;
- bitmap_iterator bi;
-
- fprintf (file, ", points-to vars: { ");
- EXECUTE_IF_SET_IN_BITMAP (pi->pt_vars, 0, ix, bi)
- {
- print_generic_expr (file, referenced_var (ix), dump_flags);
- fprintf (file, " ");
- }
- fprintf (file, "}");
+ fprintf (file, ", points-to vars: ");
+ dump_decl_set (file, pi->pt_vars);
}
}
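
The dump_points_to_info_for hunk above delegates the bitmap walk to the new dump_decl_set helper, whose body does not appear in this part of the patch. A plausible sketch inferred from the loop it replaces, offered as an assumption rather than the committed implementation (assumes GCC's internal headers: bitmap.h, tree-flow.h, diagnostic.h):

/* Possible shape of dump_decl_set, reconstructed from the removed
   open-coded loop above.  SET holds DECL_UIDs of referenced variables.  */
void
dump_decl_set (FILE *file, bitmap set)
{
  unsigned i;
  bitmap_iterator bi;

  fprintf (file, "{ ");
  EXECUTE_IF_SET_IN_BITMAP (set, 0, i, bi)
    {
      /* Map each UID back to its _DECL node and print it.  */
      print_generic_expr (file, referenced_var (i), 0);
      fprintf (file, " ");
    }
  fprintf (file, "}\n");
}
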
diff --git a/gcc/tree-ssa-dce.c b/gcc/tree-ssa-dce.c
index 04ae8e4fbb3..d7372ae6e14 100644
--- a/gcc/tree-ssa-dce.c
+++ b/gcc/tree-ssa-dce.c
@@ -618,7 +618,7 @@ remove_dead_phis (basic_block bb)
fprintf (dump_file, "\n");
}
- remove_phi_node (phi, prev);
+ remove_phi_node (phi, prev, true);
stats.removed_phis++;
phi = next;
}
diff --git a/gcc/tree-ssa-dom.c b/gcc/tree-ssa-dom.c
index b23a5032faf..8b7ce0a9aaf 100644
--- a/gcc/tree-ssa-dom.c
+++ b/gcc/tree-ssa-dom.c
@@ -2098,7 +2098,7 @@ static void
remove_stmt_or_phi (tree t)
{
if (TREE_CODE (t) == PHI_NODE)
- remove_phi_node (t, NULL);
+ remove_phi_node (t, NULL, true);
else
{
block_stmt_iterator bsi = bsi_for_stmt (t);
diff --git a/gcc/tree-ssa-loop-im.c b/gcc/tree-ssa-loop-im.c
index e5d781d508e..6d14d80ca95 100644
--- a/gcc/tree-ssa-loop-im.c
+++ b/gcc/tree-ssa-loop-im.c
@@ -916,22 +916,16 @@ free_mem_ref_locs (struct mem_ref_loc *mem_refs)
static void
rewrite_mem_refs (tree tmp_var, struct mem_ref_loc *mem_refs)
{
- bitmap loads = BITMAP_ALLOC (NULL);
- bitmap stores = BITMAP_ALLOC (NULL);
+ mem_syms_map_t mp;
for (; mem_refs; mem_refs = mem_refs->next)
{
- bitmap_clear (loads);
- bitmap_clear (stores);
- get_loads_and_stores (mem_refs->stmt, loads, stores);
- mark_set_for_renaming (stores);
- mark_set_for_renaming (loads);
+ mp = get_loads_and_stores (mem_refs->stmt);
+ mark_set_for_renaming (mp->stores);
+ mark_set_for_renaming (mp->loads);
*mem_refs->ref = tmp_var;
update_stmt (mem_refs->stmt);
}
-
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
}
/* The name and the length of the currently generated variable
@@ -1217,14 +1211,11 @@ gather_mem_refs_stmt (struct loop *loop, htab_t mem_refs,
PTR *slot;
struct mem_ref *ref = NULL;
bool is_stored;
- bitmap loads, stores;
+ mem_syms_map_t mp;
if (ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS))
return;
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
-
/* Recognize MEM = (SSA_NAME | invariant) and SSA_NAME = MEM patterns. */
if (TREE_CODE (stmt) != MODIFY_EXPR)
goto fail;
@@ -1277,20 +1268,25 @@ gather_mem_refs_stmt (struct loop *loop, htab_t mem_refs,
}
ref->is_stored |= is_stored;
- get_loads_and_stores (stmt, loads, stores);
- bitmap_ior_into (ref->vops, loads);
- bitmap_ior_into (ref->vops, stores);
+ mp = get_loads_and_stores (stmt);
+
+ if (mp->loads)
+ bitmap_ior_into (ref->vops, mp->loads);
+
+ if (mp->stores)
+ bitmap_ior_into (ref->vops, mp->stores);
+
record_mem_ref_loc (&ref->locs, stmt, mem);
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
return;
fail:
- get_loads_and_stores (stmt, loads, stores);
- bitmap_ior_into (clobbered_vops, loads);
- bitmap_ior_into (clobbered_vops, stores);
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+ mp = get_loads_and_stores (stmt);
+
+ if (mp->loads)
+ bitmap_ior_into (clobbered_vops, mp->loads);
+
+ if (mp->stores)
+ bitmap_ior_into (clobbered_vops, mp->stores);
}
/* Gathers memory references in LOOP. Notes vops accessed through unrecognized
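
The tree-ssa-loop-im.c hunks above establish the new consumer pattern: get_loads_and_stores returns a pointer into a per-statement table, either bitmap may be NULL when the corresponding set is empty, and callers no longer allocate or free anything. A condensed sketch of that pattern (the function name is hypothetical; assumes GCC's internal headers such as tree-flow.h and tree-ssa-operands.h):

/* Hypothetical helper that marks every symbol loaded or stored by
   STMT for renaming, following the NULL-checking convention used by
   the callers converted in this patch.  */
static void
mark_memory_syms_for_renaming (tree stmt)
{
  mem_syms_map_t mp;

  /* get_loads_and_stores asserts that STMT references memory.  */
  if (!stmt_references_memory_p (stmt))
    return;

  mp = get_loads_and_stores (stmt);

  /* Either set may be NULL when it is empty.  */
  if (mp->loads)
    mark_set_for_renaming (mp->loads);

  if (mp->stores)
    mark_set_for_renaming (mp->stores);
}
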
diff --git a/gcc/tree-ssa-loop-ivopts.c b/gcc/tree-ssa-loop-ivopts.c
index 7de29e80a18..6ac799b31ac 100644
--- a/gcc/tree-ssa-loop-ivopts.c
+++ b/gcc/tree-ssa-loop-ivopts.c
@@ -5186,12 +5186,7 @@ remove_statement (tree stmt, bool including_defined_name)
{
if (TREE_CODE (stmt) == PHI_NODE)
{
- if (!including_defined_name)
- {
- /* Prevent the ssa name defined by the statement from being removed. */
- SET_PHI_RESULT (stmt, NULL);
- }
- remove_phi_node (stmt, NULL_TREE);
+ remove_phi_node (stmt, NULL_TREE, including_defined_name);
}
else
{
diff --git a/gcc/tree-ssa-loop-manip.c b/gcc/tree-ssa-loop-manip.c
index 368ab7ff39f..cf0e357d0a4 100644
--- a/gcc/tree-ssa-loop-manip.c
+++ b/gcc/tree-ssa-loop-manip.c
@@ -457,11 +457,11 @@ split_loop_exit_edge (edge exit)
/* If the argument of the phi node is a constant, we do not need
to keep it inside loop. */
- if (TREE_CODE (name) != SSA_NAME)
+ if (TREE_CODE (name) != SSA_NAME || !is_gimple_reg (name))
continue;
/* Otherwise create an auxiliary phi node that will copy the value
- of the ssa name out of the loop. */
+ of the SSA name out of the loop. */
new_name = duplicate_ssa_name (name, NULL);
new_phi = create_phi_node (new_name, bb);
SSA_NAME_DEF_STMT (new_name) = new_phi;
diff --git a/gcc/tree-ssa-loop-prefetch.c b/gcc/tree-ssa-loop-prefetch.c
index 15d3b8e3de6..b58dbf6fccd 100644
--- a/gcc/tree-ssa-loop-prefetch.c
+++ b/gcc/tree-ssa-loop-prefetch.c
@@ -46,9 +46,6 @@ Software Foundation, 59 Temple Place - Suite 330, Boston, MA
#include "params.h"
#include "langhooks.h"
-#undef SSA_NAME_VAR
-#define SSA_NAME_VAR(NODE) __extension__ ({ extern tree mem_var; const tree __t = SSA_NAME_CHECK (NODE)->ssa_name.var; gcc_assert (__t != mem_var); __t; })
-
/* This pass inserts prefetch instructions to optimize cache usage during
accesses to arrays in loops. It processes loops sequentially and:
diff --git a/gcc/tree-ssa-operands.c b/gcc/tree-ssa-operands.c
index 1ec274b5518..0dafc966ae5 100644
--- a/gcc/tree-ssa-operands.c
+++ b/gcc/tree-ssa-operands.c
@@ -162,7 +162,7 @@ static vuse_optype_p free_vuses = NULL;
static vdef_optype_p free_vdefs = NULL;
/* Sets of symbols loaded and stored by a statement. These sets are
- only used when the operand scanner is called via get_loads_and_stores. */
+ only used when the operand scanner is called via update_loads_and_stores. */
static bitmap loaded_syms = NULL;
static bitmap stored_syms = NULL;
@@ -204,6 +204,40 @@ DEF_VEC_ALLOC_P(scb_t,heap);
of changes for the popped statement. */
static VEC(scb_t,heap) *scb_stack;
+/* Table, indexed by statement, for symbols loaded and stored by
+ statements that reference memory. */
+static htab_t mem_syms_tbl;
+
+/* Hashing and equality functions for MEM_SYMS_TBL. */
+
+static hashval_t
+mem_syms_hash (const void *p)
+{
+ return htab_hash_pointer ((const void *)((const mem_syms_map_t)p)->stmt);
+}
+
+static int
+mem_syms_eq (const void *p1, const void *p2)
+{
+ return ((const mem_syms_map_t)p1)->stmt == ((const mem_syms_map_t)p2)->stmt;
+}
+
+static void
+mem_syms_free (void *p)
+{
+ bitmap loads = ((mem_syms_map_t)p)->loads;
+ bitmap stores = ((mem_syms_map_t)p)->stores;
+ bool shared_p = loads == stores;
+
+ BITMAP_FREE (loads);
+
+ /* PHI nodes share loads/stores sets. Avoid freeing them twice. */
+ if (!shared_p)
+ BITMAP_FREE (stores);
+
+ free (p);
+}
+
/* Return true if the parser should just gather symbols loaded/stored
by the statement instead of scanning for operands. */
@@ -297,9 +331,11 @@ init_ssa_operands (void)
build_vuses = VEC_alloc (tree, heap, 25);
build_vdefs = VEC_alloc (tree, heap, 25);
scb_stack = VEC_alloc (scb_t, heap, 20);
-
gcc_assert (operand_memory == NULL);
operand_memory_index = SSA_OPERAND_MEMORY_SIZE;
+ gcc_assert (mem_syms_tbl == NULL);
+ mem_syms_tbl = htab_create (200, mem_syms_hash, mem_syms_eq, mem_syms_free);
+
ops_active = true;
}
@@ -330,6 +366,9 @@ fini_ssa_operands (void)
ggc_free (ptr);
}
+ htab_delete (mem_syms_tbl);
+ mem_syms_tbl = NULL;
+
ops_active = false;
}
@@ -1049,6 +1088,12 @@ append_vuse (void)
static inline void
add_virtual_operator (int flags)
{
+ /* If no alias information exists, we are not going to build the
+ virtual SSA form, so adding a VDEF/VUSE operator will only upset
+ the SSA verifier because its operands will not be renamed. */
+ if (!aliases_computed_p)
+ return;
+
/* If we are inside an ADDR_EXPR, then VAR is not being referenced
but used to compute address arithmetic. */
if (flags & opf_no_vops)
@@ -1071,7 +1116,7 @@ add_virtual_operator (int flags)
static bool
access_can_touch_variable (tree ref, tree alias, HOST_WIDE_INT offset,
HOST_WIDE_INT size)
-{
+{
bool offsetgtz = offset > 0;
unsigned HOST_WIDE_INT uoffset = (unsigned HOST_WIDE_INT) offset;
tree base = ref ? get_base_address (ref) : NULL;
@@ -1240,9 +1285,9 @@ add_mem_symbol (tree decl, int flags)
flags &= ~opf_def;
uid = DECL_UID (decl);
- if (flags & opf_def)
+ if (stored_syms && (flags & opf_def))
bitmap_set_bit (stored_syms, uid);
- else
+ else if (loaded_syms)
bitmap_set_bit (loaded_syms, uid);
}
@@ -1331,10 +1376,6 @@ get_mem_symbols_in_indirect_ref (tree expr, int flags, tree full_ref,
tree *pptr = &TREE_OPERAND (expr, 0);
tree ptr = *pptr;
- /* If we don't have aliasing information, do nothing. */
- if (!aliases_computed_p)
- return;
-
/* No symbols referenced if PTR is not a variable. */
if (!SSA_VAR_P (ptr))
return;
@@ -1654,13 +1695,21 @@ add_stmt_operand (tree *var_p, stmt_ann_t s_ann, int flags)
/* .MEM should not be found in the code. */
gcc_assert (var != mem_var);
- /* When gathering loads and stores, look for the symbols
- associated with VAR, otherwise add the virtual operator
- indicated by FLAGS. */
+ /* Mark the statement as having memory operands. */
+ s_ann->references_memory = true;
+
if (gathering_loads_stores ())
- add_mem_symbols_in_decl (var, flags);
+ {
+ /* When gathering loads and stores, look for the symbols
+ associated with VAR. */
+ add_mem_symbols_in_decl (var, flags);
+ }
else
- add_virtual_operator (flags);
+ {
+ /* Otherwise, add a virtual operator for .MEM. */
+ add_virtual_operator (flags);
+ }
}
}
@@ -1686,6 +1735,8 @@ get_indirect_ref_operands (tree stmt, tree expr, int flags)
tree ptr = *pptr;
stmt_ann_t s_ann = stmt_ann (stmt);
+ s_ann->references_memory = true;
+
/* If the dereference is volatile, mark the statement accordingly. */
if (TREE_THIS_VOLATILE (expr))
s_ann->has_volatile_ops = true;
@@ -1700,27 +1751,20 @@ get_indirect_ref_operands (tree stmt, tree expr, int flags)
if (SSA_VAR_P (ptr))
{
- /* Only add a reference to .MEM if we have alias information
- available. Otherwise, the SSA renamer will try to access the
- aliased symbols associated with this pointer dereference and
- find nothing. */
- if (aliases_computed_p)
- {
- tree tag;
+ tree tag;
- if (TREE_CODE (ptr) == SSA_NAME
- && SSA_NAME_PTR_INFO (ptr)
- && SSA_NAME_PTR_INFO (ptr)->name_mem_tag)
- tag = SSA_NAME_PTR_INFO (ptr)->name_mem_tag;
- else
- {
- tree sym = DECL_P (ptr) ? ptr : SSA_NAME_VAR (ptr);
- tag = var_ann (sym)->symbol_mem_tag;
- }
-
- if (tag)
- add_virtual_operator (flags);
+ if (TREE_CODE (ptr) == SSA_NAME
+ && SSA_NAME_PTR_INFO (ptr)
+ && SSA_NAME_PTR_INFO (ptr)->name_mem_tag)
+ tag = SSA_NAME_PTR_INFO (ptr)->name_mem_tag;
+ else
+ {
+ tree sym = DECL_P (ptr) ? ptr : SSA_NAME_VAR (ptr);
+ tag = var_ann (sym)->symbol_mem_tag;
}
+
+ if (tag)
+ add_virtual_operator (flags);
}
else if (TREE_CODE (ptr) == INTEGER_CST)
{
@@ -1749,6 +1793,9 @@ static void
get_tmr_operands (tree stmt, tree expr, int flags)
{
tree tag;
+ stmt_ann_t ann = stmt_ann (stmt);
+
+ ann->references_memory = true;
/* When gathering memory symbols, there is no need to keep analyzing
the rest of the expression. */
@@ -1763,10 +1810,7 @@ get_tmr_operands (tree stmt, tree expr, int flags)
get_expr_operands (stmt, &TMR_INDEX (expr), opf_use);
if (TMR_SYMBOL (expr))
- {
- stmt_ann_t ann = stmt_ann (stmt);
- add_to_addressable_set (TMR_SYMBOL (expr), &ann->addresses_taken);
- }
+ add_to_addressable_set (TMR_SYMBOL (expr), &ann->addresses_taken);
tag = TMR_TAG (expr);
if (!tag)
@@ -1788,6 +1832,8 @@ get_call_expr_operands (tree stmt, tree expr)
tree op;
int call_flags = call_expr_flags (expr);
+ stmt_ann (stmt)->references_memory = true;
+
if (!bitmap_empty_p (call_clobbered_vars) && !(call_flags & ECF_NOVOPS))
{
/* A 'pure' or a 'const' function never call-clobbers anything.
@@ -1881,6 +1927,8 @@ get_asm_expr_operands (tree stmt)
unsigned i;
bitmap_iterator bi;
+ s_ann->references_memory = true;
+
EXECUTE_IF_SET_IN_BITMAP (call_clobbered_vars, 0, i, bi)
{
tree var;
@@ -2246,9 +2294,13 @@ build_ssa_operands (tree stmt)
{
stmt_ann_t ann = get_stmt_ann (stmt);
- /* Initially assume that the statement has no volatile operands. */
+ /* Initially assume that the statement has no volatile operands and
+ makes no memory references. */
if (ann)
- ann->has_volatile_ops = false;
+ {
+ ann->has_volatile_ops = false;
+ ann->references_memory = false;
+ }
start_ssa_stmt_operands ();
@@ -2257,6 +2309,11 @@ build_ssa_operands (tree stmt)
operand_build_sort_virtual (build_vdefs);
finalize_ssa_stmt_operands (stmt);
+
+ /* For added safety, assume that statements with volatile operands
+ also reference memory. */
+ if (ann->has_volatile_ops)
+ ann->references_memory = true;
}
@@ -2272,6 +2329,182 @@ free_ssa_operands (stmt_operands_p ops)
}
+/* Lookup in MEM_SYMS_TBL a slot corresponding to STMT. Create a new
+ one if needed. */
+
+mem_syms_map_t
+get_loads_and_stores (tree stmt)
+{
+ struct mem_syms_map_d m, *mp;
+ void **slot;
+
+ gcc_assert (mem_syms_tbl);
+ gcc_assert (stmt_references_memory_p (stmt));
+
+ m.stmt = stmt;
+ slot = htab_find_slot (mem_syms_tbl, (void *) &m, INSERT);
+ if (*slot == NULL)
+ {
+ mp = XNEW (struct mem_syms_map_d);
+ mp->stmt = stmt;
+ mp->loads = BITMAP_ALLOC (NULL);
+
+ if (TREE_CODE (stmt) == PHI_NODE)
+ {
+ tree lhs_sym;
+
+ /* PHI nodes load and store the same symbols. */
+ mp->stores = mp->loads;
+
+ /* Non-factored PHI nodes load and store exactly one symbol. */
+ lhs_sym = SSA_NAME_VAR (PHI_RESULT (stmt));
+ if (lhs_sym != mem_var)
+ bitmap_set_bit (mp->stores, DECL_UID (lhs_sym));
+ }
+ else
+ mp->stores = BITMAP_ALLOC (NULL);
+
+ *slot = (void *) mp;
+ }
+ else
+ mp = (mem_syms_map_t) *slot;
+
+ return mp;
+}
+
+
+/* Delete the load and store sets associated with STMT. */
+
+void
+delete_loads_and_stores (tree stmt)
+{
+ if (mem_syms_tbl)
+ {
+ struct mem_syms_map_d m;
+ void **slot;
+
+ m.stmt = stmt;
+ slot = htab_find_slot (mem_syms_tbl, (void *) &m, NO_INSERT);
+ if (slot)
+ htab_remove_elt (mem_syms_tbl, *slot);
+ }
+}
+
+
+/* Add LOADS and STORES to the corresponding sets in STMT.
+
+ NOTE: Not to be used lightly. This is generally used by the SSA
+ renamer to initialize the sets of factored symbols in factored
+ PHI nodes. */
+
+void
+add_loads_and_stores (tree stmt, bitmap loads, bitmap stores)
+{
+ mem_syms_map_t mp = get_loads_and_stores (stmt);
+
+ if (loads)
+ bitmap_ior_into (mp->loads, loads);
+
+ if (stores)
+ bitmap_ior_into (mp->stores, stores);
+}
+
+
+/* Move the sets of loads and stores from OLD_STMT into NEW_STMT.
+ Remove the entry for OLD_STMT. */
+
+void
+move_loads_and_stores (tree new_stmt, tree old_stmt)
+{
+ void **slot;
+ mem_syms_map_t old_mp, new_mp;
+
+ gcc_assert (TREE_CODE (new_stmt) == TREE_CODE (old_stmt));
+
+ old_mp = get_loads_and_stores (old_stmt);
+ new_mp = get_loads_and_stores (new_stmt);
+
+ bitmap_copy (new_mp->loads, old_mp->loads);
+
+ /* PHI nodes share their loads and stores sets. */
+ if (TREE_CODE (new_stmt) != PHI_NODE)
+ bitmap_copy (new_mp->stores, old_mp->stores);
+
+ slot = htab_find_slot (mem_syms_tbl, (void *) old_mp, NO_INSERT);
+ htab_remove_elt (mem_syms_tbl, *slot);
+}
+
+
+/* Update the sets of symbols loaded and stored by STMT.
+
+ FIXME, if the statement has a store, we should use the SSA name on
+ its LHS as the index into an array. Only use hashing for
+ statements that only have a load operation. */
+
+static void
+update_loads_and_stores (tree stmt)
+{
+ mem_syms_map_t mp;
+
+ memset (&clobber_stats, 0, sizeof (clobber_stats));
+
+ mp = get_loads_and_stores (stmt);
+
+ if (mp->loads)
+ bitmap_clear (mp->loads);
+ else
+ mp->loads = BITMAP_ALLOC (NULL);
+
+ if (mp->stores)
+ bitmap_clear (mp->stores);
+ else
+ mp->stores = BITMAP_ALLOC (NULL);
+
+ /* Point the internal loaded/stored sets at the statement's own sets. */
+ loaded_syms = mp->loads;
+ stored_syms = mp->stores;
+
+ /* Parse the statement. We do not care about its operands here, so
+ there is no need to initialize anything. If any operand is added
+ to the operand cache, it is discarded.
+
+ FIXME: this is unnecessary. A single call to parse_ssa_operands
+ should collect both operands and symbols. */
+ gcc_assert (loaded_syms || stored_syms);
+ parse_ssa_operands (stmt);
+ truncate_ssa_stmt_operands ();
+
+ /* We don't need the symbol sets anymore. */
+ loaded_syms = NULL;
+ stored_syms = NULL;
+
+ if (dump_file && (dump_flags & TDF_DETAILS))
+ dump_loads_and_stores (dump_file, stmt);
+
+ if (mp->loads && bitmap_empty_p (mp->loads))
+ BITMAP_FREE (mp->loads);
+
+ if (mp->stores && bitmap_empty_p (mp->stores))
+ BITMAP_FREE (mp->stores);
+
+ if (dump_file && (dump_flags & TDF_STATS))
+ {
+ fprintf (dump_file, "Original clobbered vars: %d\n",
+ clobber_stats.clobbered_vars);
+ fprintf (dump_file, "Static write clobbers avoided: %d\n",
+ clobber_stats.static_write_clobbers_avoided);
+ fprintf (dump_file, "Static read clobbers avoided: %d\n",
+ clobber_stats.static_read_clobbers_avoided);
+ fprintf (dump_file, "Unescapable clobbers avoided: %d\n",
+ clobber_stats.unescapable_clobbers_avoided);
+ fprintf (dump_file, "Original read-only clobbers: %d\n",
+ clobber_stats.readonly_clobbers);
+ fprintf (dump_file, "Static read-only clobbers avoided: %d\n",
+ clobber_stats.static_readonly_clobbers_avoided);
+ }
+}
+
+
/* Get the operands of statement STMT. */
void
@@ -2288,18 +2521,23 @@ update_stmt_operands (tree stmt)
_DECL. This indicates a bug in the gimplifier. */
gcc_assert (!SSA_VAR_P (stmt));
- gcc_assert (ann->modified);
-
- /* This cannot be called from get_loads_and_stores. */
+ /* This cannot be called from update_loads_and_stores. */
gcc_assert (!gathering_loads_stores ());
timevar_push (TV_TREE_OPS);
+ gcc_assert (ann->modified);
build_ssa_operands (stmt);
-
- /* Clear the modified bit for STMT. */
ann->modified = 0;
+ /* If the statement references memory, collect the symbols it
+ loads and stores.
+
+ FIXME: we should stop scanning the statement twice. A single
+ scan should collect both operands and symbols. */
+ if (stmt_references_memory_p (stmt))
+ update_loads_and_stores (stmt);
+
timevar_pop (TV_TREE_OPS);
}
@@ -2626,157 +2864,42 @@ debug_immediate_uses_for (tree var)
dump_immediate_uses_for (stderr, var);
}
-void debug_loads_and_stores (tree);
-void dump_loads_and_stores (FILE *, tree, bitmap, bitmap);
-
-/* Dump symbols loaded and stored by STMT to FILE. If LOADS and
- STORES are given, the statement is not scanned. */
+/* Dump symbols loaded and stored by STMT to FILE. */
void
-dump_loads_and_stores (FILE *file, tree stmt, bitmap loads, bitmap stores)
+dump_loads_and_stores (FILE *file, tree stmt)
{
- unsigned i;
- bitmap_iterator bi;
- bool free_memory_p = false;
-
- if (loads == NULL && stores == NULL)
- {
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, loads, stores);
- free_memory_p = true;
- }
+ mem_syms_map_t mp;
- fprintf (file, "\n\n");
print_generic_stmt (file, stmt, TDF_VOPS);
- fprintf (file, "\tLOADS: { ");
- EXECUTE_IF_SET_IN_BITMAP (loads, 0, i, bi)
- {
- print_generic_expr (file, referenced_var (i), 0);
- fprintf (file, " ");
- }
- fprintf (file, "}\n");
- fprintf (file, "\tSTORES: { ");
- EXECUTE_IF_SET_IN_BITMAP (stores, 0, i, bi)
- {
- print_generic_expr (file, referenced_var (i), 0);
- fprintf (file, " ");
- }
- fprintf (file, "}\n");
+ mp = get_loads_and_stores (stmt);
- if (free_memory_p)
+ if (TREE_CODE (stmt) == PHI_NODE)
{
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+ fprintf (file, "\tLOADS/STORES: ");
+ dump_decl_set (file, mp->loads);
+ if (mp->loads != mp->stores)
+ fprintf (file, "\n\tWARNING. LOADS and STORES sets not shared.\n");
}
-}
-
-
-/* Dump symbols loaded and stored by STMT to stderr. */
-
-void
-debug_loads_and_stores (tree stmt)
-{
- dump_loads_and_stores (stderr, stmt, NULL, NULL);
-}
-
-
-/* Collect all the symbols loaded and stored by the arguments of PHI
- node PHI. Store the sets in LOADS and STORES respectively. */
-
-static void
-get_loads_and_stores_for_phi (tree phi, bitmap loads, bitmap stores)
-{
- int i;
- tree lhs = PHI_RESULT (phi);
-
- gcc_assert (!is_gimple_reg (lhs));
-
- for (i = 0; i < PHI_NUM_ARGS (phi); i++)
+ else
{
- tree sym, arg;
-
- arg = PHI_ARG_DEF (phi, i);
+ fprintf (file, "\tLOADS: ");
+ dump_decl_set (file, mp->loads);
- /* Avoid infinite recursion. */
- if (arg == lhs)
- continue;
-
- sym = SSA_NAME_VAR (arg);
-
- if (sym == mem_var)
- {
- /* Recurse for an memory factored SSA name. */
- get_loads_and_stores (SSA_NAME_DEF_STMT (arg), loads, stores);
- }
- else
- {
- /* Otherwise, this PHI node will both load and store the
- underlying symbol for ARG. */
- bitmap_set_bit (loads, DECL_UID (sym));
- bitmap_set_bit (stores, DECL_UID (sym));
- }
+ fprintf (file, "\tSTORES: ");
+ dump_decl_set (file, mp->stores);
}
}
-
-
-/* Add to LOADED the set of symbols loaded by STMT and add to STORED
- the set of symbols stored by STMT. Both sets must be allocated by
- the caller. FIXME, should allow for either set to be NULL. */
+/* Dump symbols loaded and stored by STMT to stderr. */
void
-get_loads_and_stores (tree stmt, bitmap loads, bitmap stores)
+debug_loads_and_stores (tree stmt)
{
- gcc_assert (loads && stores);
- memset (&clobber_stats, 0, sizeof (clobber_stats));
-
- if (TREE_CODE (stmt) != PHI_NODE)
- {
- gcc_assert (stmt_references_memory_p (stmt));
-
- /* Point the internal loaded/stored sets to the ones provided. */
- loaded_syms = loads;
- stored_syms = stores;
-
- /* Parse the statement. We don't really care for its operands, so
- there's no need to initialize anything. If any operand was added
- to the cache, it is discarded. */
- parse_ssa_operands (stmt);
- truncate_ssa_stmt_operands ();
-
- /* We don't need the symbol sets anymore. */
- loaded_syms = NULL;
- stored_syms = NULL;
- }
- else
- {
- /* PHI nodes need special treatment. A PHI node loads/stores
- all the symbols loaded/stored by its arguments. */
- get_loads_and_stores_for_phi (stmt, loads, stores);
- }
-
- if (dump_file && (dump_flags & TDF_DETAILS))
- dump_loads_and_stores (dump_file, stmt, loads, stores);
-
- if (dump_file && (dump_flags & TDF_STATS))
- {
- fprintf (dump_file, "Original clobbered vars: %d\n",
- clobber_stats.clobbered_vars);
- fprintf (dump_file, "Static write clobbers avoided: %d\n",
- clobber_stats.static_write_clobbers_avoided);
- fprintf (dump_file, "Static read clobbers avoided: %d\n",
- clobber_stats.static_read_clobbers_avoided);
- fprintf (dump_file, "Unescapable clobbers avoided: %d\n",
- clobber_stats.unescapable_clobbers_avoided);
- fprintf (dump_file, "Original read-only clobbers: %d\n",
- clobber_stats.readonly_clobbers);
- fprintf (dump_file, "Static read-only clobbers avoided: %d\n",
- clobber_stats.static_readonly_clobbers_avoided);
- }
+ dump_loads_and_stores (stderr, stmt);
}
@@ -2806,38 +2929,50 @@ push_stmt_changes (tree *stmt_p)
if (stmt_references_memory_p (stmt))
{
+ mem_syms_map_t mp;
+
buf->loads = BITMAP_ALLOC (NULL);
buf->stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, buf->loads, buf->stores);
+
+ mp = get_loads_and_stores (stmt);
+
+ if (mp->loads)
+ bitmap_copy (buf->loads, mp->loads);
+
+ if (mp->stores)
+ bitmap_copy (buf->stores, mp->stores);
}
VEC_safe_push (scb_t, heap, scb_stack, buf);
}
-/* Given two sets S1 and S2, mark the symbols in S1 U S2 for renaming
- if S1 != S2. Sets S1 and S2 may be overwritten by this function. */
+/* Given two sets S1 and S2, mark the symbols that differ in S1 and S2
+ for renaming. The set to mark for renaming is (S1 & ~S2) | (S2 & ~S1). */
static void
mark_difference_for_renaming (bitmap s1, bitmap s2)
{
- bitmap tmp = NULL;
-
if (s1 == NULL && s2 == NULL)
return;
if (s1 && s2 == NULL)
- tmp = s1;
+ mark_set_for_renaming (s1);
else if (s1 == NULL && s2)
- tmp = s2;
+ mark_set_for_renaming (s2);
else if (!bitmap_equal_p (s1, s2))
{
- tmp = s1;
- bitmap_ior_into (tmp, s2);
- }
+ bitmap t1 = BITMAP_ALLOC (NULL);
+ bitmap t2 = BITMAP_ALLOC (NULL);
+
+ bitmap_and_compl (t1, s1, s2);
+ bitmap_and_compl (t2, s2, s1);
+ bitmap_ior_into (t1, t2);
+ mark_set_for_renaming (t1);
- if (tmp)
- mark_set_for_renaming (tmp);
+ BITMAP_FREE (t1);
+ BITMAP_FREE (t2);
+ }
}
@@ -2850,6 +2985,7 @@ pop_stmt_changes (tree *stmt_p)
{
tree op, stmt;
ssa_op_iter iter;
+ mem_syms_map_t mp;
bitmap loads, stores;
scb_t buf;
@@ -2872,9 +3008,9 @@ pop_stmt_changes (tree *stmt_p)
loads = stores = NULL;
if (stmt_references_memory_p (stmt))
{
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, loads, stores);
+ mp = get_loads_and_stores (stmt);
+ loads = mp->loads;
+ stores = mp->stores;
}
/* If LOADS is different from BUF->LOADS, the affected
@@ -2908,9 +3044,24 @@ pop_stmt_changes (tree *stmt_p)
BITMAP_FREE (buf->stores);
buf->stmt_p = NULL;
free (buf);
+}
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+
+/* Returns true if statement STMT may access memory. */
+
+bool
+stmt_references_memory_p (tree stmt)
+{
+ if (!ops_active)
+ return false;
+
+ if (TREE_CODE (stmt) == PHI_NODE)
+ {
+ tree lhs_sym = SSA_NAME_VAR (PHI_RESULT (stmt));
+ return lhs_sym == mem_var || !is_gimple_reg (lhs_sym);
+ }
+
+ return stmt_ann (stmt)->references_memory;
}
#include "gt-tree-ssa-operands.h"
diff --git a/gcc/tree-ssa-operands.h b/gcc/tree-ssa-operands.h
index 9deee154f4e..4340f0d1190 100644
--- a/gcc/tree-ssa-operands.h
+++ b/gcc/tree-ssa-operands.h
@@ -188,11 +188,14 @@ extern void dump_immediate_uses (FILE *file);
extern void dump_immediate_uses_for (FILE *file, tree var);
extern void debug_immediate_uses (void);
extern void debug_immediate_uses_for (tree var);
+extern void debug_loads_and_stores (tree);
+extern void dump_loads_and_stores (FILE *, tree);
+extern void dump_decl_set (FILE *, bitmap);
+extern void debug_decl_set (bitmap);
extern bool ssa_operands_active (void);
extern void add_to_addressable_set (tree, bitmap *);
-extern void get_loads_and_stores (tree, bitmap, bitmap);
extern void push_stmt_changes (tree *);
extern void pop_stmt_changes (tree *);
@@ -324,4 +327,19 @@ typedef struct ssa_operand_iterator_d
/* This macro counts the number of operands in STMT matching FLAGS. */
#define NUM_SSA_OPERANDS(STMT, FLAGS) num_ssa_operands (STMT, FLAGS)
+/* Mapping between a statement and the symbols referenced by it. */
+struct mem_syms_map_d
+{
+ tree stmt;
+ bitmap loads;
+ bitmap stores;
+};
+
+typedef struct mem_syms_map_d *mem_syms_map_t;
+
+extern mem_syms_map_t get_loads_and_stores (tree);
+extern void add_loads_and_stores (tree, bitmap, bitmap);
+extern void move_loads_and_stores (tree, tree);
+extern void delete_loads_and_stores (tree);
+
#endif /* GCC_TREE_SSA_OPERANDS_H */
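
A convention worth noting in the interface added above: for PHI nodes the LOADS and STORES fields point at the same bitmap (a PHI reads and writes the same symbols, and the verifier below enforces the sharing), while for ordinary statements the two sets are distinct and either may be NULL when empty. A short sketch that respects the convention when accumulating referenced symbols (hypothetical helper; assumes GCC's internal headers such as bitmap.h, tree-flow.h, tree-ssa-operands.h):

/* Hypothetical helper that ORs all symbols referenced by STMT into
   ACCUM, honoring the PHI-node sharing convention.  */
static void
accumulate_referenced_syms (tree stmt, bitmap accum)
{
  mem_syms_map_t mp = get_loads_and_stores (stmt);

  /* For PHI nodes MP->LOADS == MP->STORES, so one field covers the
     whole set of factored symbols.  */
  if (mp->stores)
    bitmap_ior_into (accum, mp->stores);

  /* Ordinary statements keep separate sets; pick up the loads too.  */
  if (TREE_CODE (stmt) != PHI_NODE && mp->loads)
    bitmap_ior_into (accum, mp->loads);
}
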
diff --git a/gcc/tree-ssa-pre.c b/gcc/tree-ssa-pre.c
index a95f921f7d5..d54a4354f6a 100644
--- a/gcc/tree-ssa-pre.c
+++ b/gcc/tree-ssa-pre.c
@@ -3695,7 +3695,7 @@ remove_dead_inserted_code (void)
if (TREE_CODE (t) == PHI_NODE)
{
- remove_phi_node (t, NULL);
+ remove_phi_node (t, NULL, true);
}
else
{
diff --git a/gcc/tree-ssa-structalias.c b/gcc/tree-ssa-structalias.c
index 45433534863..3f16fe7a924 100644
--- a/gcc/tree-ssa-structalias.c
+++ b/gcc/tree-ssa-structalias.c
@@ -3159,12 +3159,9 @@ update_alias_info (tree stmt, struct alias_info *ai)
/* Mark variables in VDEF operands as being written to. */
if (stmt_references_memory_p (stmt))
{
- bitmap loads = BITMAP_ALLOC (NULL);
- bitmap stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, loads, stores);
- bitmap_ior_into (ai->written_vars, stores);
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+ mem_syms_map_t mp = get_loads_and_stores (stmt);
+ if (mp->stores)
+ bitmap_ior_into (ai->written_vars, mp->stores);
}
}
@@ -4489,10 +4486,10 @@ compute_points_to_sets (struct alias_info *ai)
{
tree stmt = bsi_stmt (bsi);
find_func_aliases (stmt);
- /* Update various related attributes like escaped
- addresses, pointer dereferences for loads and stores.
- This is used when creating name tags and alias
- sets. */
+
+ /* Update various related attributes like escaped addresses,
+ pointer dereferences for loads and stores. This is used
+ when creating name tags and alias sets. */
update_alias_info (stmt, ai);
}
}
diff --git a/gcc/tree-ssa.c b/gcc/tree-ssa.c
index b20f60032a6..bd3b93ccf0c 100644
--- a/gcc/tree-ssa.c
+++ b/gcc/tree-ssa.c
@@ -76,7 +76,7 @@ ssa_redirect_edge (edge e, basic_block dest)
return e;
}
-/* Add PHI arguments queued in PENDINT_STMT list on edge E to edge
+/* Add PHI arguments queued in PENDING_STMT list on edge E to edge
E->dest. */
void
@@ -136,12 +136,23 @@ verify_ssa_name (tree ssa_name, bool is_virtual)
return true;
}
+#if 0
if (is_virtual && var_ann (SSA_NAME_VAR (ssa_name))
&& get_subvars_for_var (SSA_NAME_VAR (ssa_name)) != NULL)
{
error ("found real variable when subvariables should have appeared");
return true;
}
+#endif
+
+ /* All the memory SSA names must use the same default definition. */
+ if (is_virtual
+ && IS_EMPTY_STMT (SSA_NAME_DEF_STMT (ssa_name))
+ && SSA_NAME_VAR (ssa_name) != mem_var)
+ {
+ error ("the default definition of a memory symbol should be .MEM_0(D)");
+ return true;
+ }
return false;
}
@@ -252,6 +263,7 @@ verify_use (basic_block bb, basic_block def_bb, use_operand_p use_p,
}
if (check_abnormal
+ && ssa_name != default_def (mem_var)
&& !SSA_NAME_OCCURS_IN_ABNORMAL_PHI (ssa_name))
{
error ("SSA_NAME_OCCURS_IN_ABNORMAL_PHI should be set");
@@ -294,16 +306,23 @@ verify_use (basic_block bb, basic_block def_bb, use_operand_p use_p,
/* Return true if any of the arguments for PHI node PHI at block BB is
malformed.
- DEFINITION_BLOCK is an array of basic blocks indexed by SSA_NAME version
- numbers. If DEFINITION_BLOCK[SSA_NAME_VERSION] is set, it means that the
- block in that array slot contains the definition of SSA_NAME. */
+ DEFINITION_BLOCK is an array of basic blocks indexed by SSA_NAME
+ version numbers. If DEFINITION_BLOCK[SSA_NAME_VERSION] is set,
+ it means that the block in that array slot contains the
+ definition of SSA_NAME.
+
+ SYMS_IN_FACTORED_PHIS is the set of symbols already found in
+ factored PHI nodes of this block. No symbol should be factored
+ by more than one PHI node in the same block. */
static bool
-verify_phi_args (tree phi, basic_block bb, basic_block *definition_block)
+verify_phi_args (tree phi, basic_block bb, basic_block *definition_block,
+ bitmap syms_in_factored_phis)
{
edge e;
bool err = false;
unsigned i, phi_num_args = PHI_NUM_ARGS (phi);
+ mem_syms_map_t syms_phi = NULL;
if (EDGE_COUNT (bb->preds) != phi_num_args)
{
@@ -312,12 +331,53 @@ verify_phi_args (tree phi, basic_block bb, basic_block *definition_block)
goto error;
}
+ if (stmt_references_memory_p (phi))
+ {
+ bitmap_iterator bi;
+ unsigned i;
+
+ syms_phi = get_loads_and_stores (phi);
+ if (syms_phi->loads != syms_phi->stores)
+ {
+ error ("Sets of loads and stores should be shared");
+ err = true;
+ goto error;
+ }
+
+ EXECUTE_IF_SET_IN_BITMAP (syms_phi->stores, 0, i, bi)
+ if (is_gimple_reg (referenced_var (i)))
+ {
+ error ("Found GIMPLE register in set of stored memory symbols");
+ fprintf (stderr, "Symbol: ");
+ print_generic_stmt (stderr, referenced_var (i), 0);
+ err = true;
+ }
+
+ /* If this PHI node has no uses, then it doesn't matter if it
+ factors symbols in common with another PHI node. */
+ if (num_imm_uses (PHI_RESULT (phi)) > 0
+ && bitmap_intersect_p (syms_phi->stores, syms_in_factored_phis))
+ {
+ bitmap tmp = BITMAP_ALLOC (NULL);
+ bitmap_and (tmp, syms_phi->stores, syms_in_factored_phis);
+ error ("Found symbols factored by more than one PHI node in the "
+ "same block");
+ dump_decl_set (stderr, tmp);
+ BITMAP_FREE (tmp);
+ err = true;
+ }
+ else
+ bitmap_ior_into (syms_in_factored_phis, syms_phi->stores);
+
+ if (err)
+ goto error;
+ }
+
for (i = 0; i < phi_num_args; i++)
{
use_operand_p op_p = PHI_ARG_DEF_PTR (phi, i);
tree op = USE_FROM_PTR (op_p);
-
e = EDGE_PRED (bb, i);
if (op == NULL_TREE)
@@ -336,8 +396,11 @@ verify_phi_args (tree phi, basic_block bb, basic_block *definition_block)
}
if (TREE_CODE (op) == SSA_NAME)
- err = verify_use (e->src, definition_block[SSA_NAME_VERSION (op)], op_p,
- phi, e->flags & EDGE_ABNORMAL,
+ err = verify_use (e->src,
+ definition_block[SSA_NAME_VERSION (op)],
+ op_p,
+ phi,
+ e->flags & EDGE_ABNORMAL,
!is_gimple_reg (PHI_RESULT (phi)),
NULL);
@@ -360,7 +423,7 @@ error:
if (err)
{
fprintf (stderr, "for PHI node\n");
- print_generic_stmt (stderr, phi, TDF_VOPS);
+ print_generic_stmt (stderr, phi, TDF_VOPS|TDF_MEMSYMS);
}
@@ -713,6 +776,7 @@ verify_ssa (bool check_modified_stmt)
tree phi;
edge_iterator ei;
block_stmt_iterator bsi;
+ bitmap factored_syms;
/* Make sure that all edges have a clear 'aux' field. */
FOR_EACH_EDGE (e, ei, bb->preds)
@@ -726,14 +790,21 @@ verify_ssa (bool check_modified_stmt)
}
/* Verify the arguments for every PHI node in the block. */
+ factored_syms = BITMAP_ALLOC (NULL);
for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
{
- if (verify_phi_args (phi, bb, definition_block))
- goto err;
+ if (verify_phi_args (phi, bb, definition_block, factored_syms))
+ {
+ BITMAP_FREE (factored_syms);
+ goto err;
+ }
+
bitmap_set_bit (names_defined_in_bb,
SSA_NAME_VERSION (PHI_RESULT (phi)));
}
+ BITMAP_FREE (factored_syms);
+
/* Now verify all the uses and vuses in every statement of the block. */
for (bsi = bsi_start (bb); !bsi_end_p (bsi); bsi_next (&bsi))
{
@@ -785,7 +856,8 @@ verify_ssa (bool check_modified_stmt)
}
/* Finally, verify alias information. */
- verify_alias_info ();
+ if (aliases_computed_p)
+ verify_alias_info ();
free (definition_block);
@@ -1032,20 +1104,6 @@ tree_ssa_useless_type_conversion (tree expr)
}
-/* Returns true if statement STMT may access memory. */
-
-bool
-stmt_references_memory_p (tree stmt)
-{
- stmt_ann_t ann = stmt_ann (stmt);
-
- if (ann->has_volatile_ops)
- return true;
-
- return (!ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS));
-}
-
-
/* Internal helper for walk_use_def_chains. VAR, FN and DATA are as
described in walk_use_def_chains.
@@ -1091,7 +1149,10 @@ walk_use_def_chains_1 (tree var, walk_use_def_chains_fn fn, void *data,
for (i = 0; i < PHI_NUM_ARGS (def_stmt); i++)
{
tree arg = PHI_ARG_DEF (def_stmt, i);
- if (TREE_CODE (arg) == SSA_NAME
+
+ /* ARG may be NULL for newly introduced PHI nodes. */
+ if (arg
+ && TREE_CODE (arg) == SSA_NAME
&& walk_use_def_chains_1 (arg, fn, data, visited, is_dfs))
return true;
}
@@ -1129,7 +1190,6 @@ walk_use_def_chains_1 (tree var, walk_use_def_chains_fn fn, void *data,
If IS_DFS is false, the two steps above are done in reverse order
(i.e., a breadth-first search). */
-
void
walk_use_def_chains (tree var, walk_use_def_chains_fn fn, void *data,
bool is_dfs)
diff --git a/gcc/tree-ssanames.c b/gcc/tree-ssanames.c
index 8ef4c44c94e..5077808947b 100644
--- a/gcc/tree-ssanames.c
+++ b/gcc/tree-ssanames.c
@@ -304,7 +304,6 @@ release_defs (tree stmt)
void
replace_ssa_name_symbol (tree ssa_name, tree sym)
{
- gcc_assert (sym != mem_var && SSA_NAME_VAR (ssa_name) != mem_var);
SSA_NAME_VAR (ssa_name) = sym;
TREE_TYPE (ssa_name) = TREE_TYPE (sym);
}
diff --git a/gcc/tree-stdarg.c b/gcc/tree-stdarg.c
index 110c9f164b4..2de2f2aff70 100644
--- a/gcc/tree-stdarg.c
+++ b/gcc/tree-stdarg.c
@@ -517,28 +517,23 @@ check_all_va_list_escapes (struct stdarg_info *si)
{
tree stmt = bsi_stmt (i), use;
ssa_op_iter iter;
+ bool loads_escape_p;
+
+ loads_escape_p = false;
+ if (stmt_references_memory_p (stmt))
+ {
+ mem_syms_map_t mp = get_loads_and_stores (stmt);
+ loads_escape_p = mp->loads
+ && bitmap_intersect_p (si->va_list_escape_vars,
+ mp->loads);
+ }
FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter, SSA_OP_ALL_USES)
{
tree sym = SSA_NAME_VAR (use);
- if (sym == mem_var)
- {
- if (loads == NULL)
- {
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
- }
- else
- {
- bitmap_clear (loads);
- bitmap_clear (stores);
- }
-
- get_loads_and_stores (stmt, loads, stores);
- if (!bitmap_intersect_p (si->va_list_escape_vars, loads))
- continue;
- }
+ if (sym == mem_var)
+ {
+ if (!loads_escape_p)
+ continue;
+ }
else if (!bitmap_bit_p (si->va_list_escape_vars, DECL_UID (sym)))
continue;
diff --git a/gcc/tree-vect-transform.c b/gcc/tree-vect-transform.c
index 1184aa5da2e..fb4f26d7530 100644
--- a/gcc/tree-vect-transform.c
+++ b/gcc/tree-vect-transform.c
@@ -1669,7 +1669,7 @@ vectorizable_store (tree stmt, block_stmt_iterator *bsi, tree *vec_stmt)
ssa_op_iter iter;
tree def, def_stmt;
enum vect_def_type dt;
- bitmap loads, stores;
+ mem_syms_map_t mp;
/* Is vectorizable store? */
@@ -1736,13 +1736,9 @@ vectorizable_store (tree stmt, block_stmt_iterator *bsi, tree *vec_stmt)
FOR_EACH_SSA_TREE_OPERAND (def, stmt, iter, SSA_OP_VDEF)
SSA_NAME_DEF_STMT (def) = *vec_stmt;
- loads = BITMAP_ALLOC (NULL);
- stores = BITMAP_ALLOC (NULL);
- get_loads_and_stores (stmt, loads, stores);
- mark_set_for_renaming (loads);
- mark_set_for_renaming (stores);
- BITMAP_FREE (loads);
- BITMAP_FREE (stores);
+ mp = get_loads_and_stores (stmt);
+ mark_set_for_renaming (mp->loads);
+ mark_set_for_renaming (mp->stores);
return true;
}
@@ -2976,8 +2972,6 @@ vect_transform_loop (loop_vec_info loop_vinfo,
int i;
tree ratio = NULL;
int vectorization_factor = LOOP_VINFO_VECT_FACTOR (loop_vinfo);
- bitmap_iterator bi;
- unsigned int j;
if (vect_print_dump_info (REPORT_DETAILS))
fprintf (vect_dump, "=== vec_transform_loop ===");
@@ -3040,7 +3034,7 @@ vect_transform_loop (loop_vec_info loop_vinfo,
/* CHECKME: we wouldn't need this if we called update_ssa once
for all loops. */
- bitmap_zero (vect_vnames_to_rename);
+ bitmap_zero (vect_memsyms_to_rename);
/* Peel the loop if there are data refs with unknown alignment.
Only one data ref with unknown store is allowed. */
@@ -3127,8 +3121,7 @@ vect_transform_loop (loop_vec_info loop_vinfo,
slpeel_make_loop_iterate_ntimes (loop, ratio);
- EXECUTE_IF_SET_IN_BITMAP (vect_vnames_to_rename, 0, j, bi)
- mark_sym_for_renaming (SSA_NAME_VAR (ssa_name (j)));
+ mark_set_for_renaming (vect_memsyms_to_rename);
/* The memory tags and pointers in vectorized statements need to
have their SSA forms updated. FIXME, why can't this be delayed
diff --git a/gcc/tree-vectorizer.c b/gcc/tree-vectorizer.c
index e0b61276980..dabc6ec98ea 100644
--- a/gcc/tree-vectorizer.c
+++ b/gcc/tree-vectorizer.c
@@ -182,7 +182,7 @@ unsigned int vect_loops_num;
static LOC vect_loop_location;
/* Bitmap of virtual variables to be renamed. */
-bitmap vect_vnames_to_rename;
+bitmap vect_memsyms_to_rename;
/*************************************************************************
Simple Loop Peeling Utilities
@@ -526,20 +526,29 @@ slpeel_update_phi_nodes_for_guard1 (edge guard_edge, struct loop *loop,
orig_phi && update_phi;
orig_phi = PHI_CHAIN (orig_phi), update_phi = PHI_CHAIN (update_phi))
{
- /* Virtual phi; Mark it for renaming. We actually want to call
+ tree sym;
+
+ /* Virtual PHI. Mark it for renaming. We actually want to call
mar_sym_for_renaming, but since all ssa renaming datastructures
- are going to be freed before we get to call ssa_upate, we just
+ are going to be freed before we get to call update_ssa, we just
record this name for now in a bitmap, and will mark it for
renaming later. */
name = PHI_RESULT (orig_phi);
- if (!is_gimple_reg (SSA_NAME_VAR (name)))
- bitmap_set_bit (vect_vnames_to_rename, SSA_NAME_VERSION (name));
+ sym = SSA_NAME_VAR (name);
+ if (sym != mem_var && !is_gimple_reg (sym))
+ bitmap_set_bit (vect_memsyms_to_rename, DECL_UID (sym));
/** 1. Handle new-merge-point phis **/
/* 1.1. Generate new phi node in NEW_MERGE_BB: */
- new_phi = create_phi_node (SSA_NAME_VAR (PHI_RESULT (orig_phi)),
- new_merge_bb);
+ sym = SSA_NAME_VAR (PHI_RESULT (orig_phi));
+ if (sym == mem_var)
+ {
+ bitmap syms = get_loads_and_stores (orig_phi)->stores;
+ new_phi = create_factored_phi_node (sym, new_merge_bb, syms);
+ }
+ else
+ new_phi = create_phi_node (sym, new_merge_bb);
/* 1.2. NEW_MERGE_BB has two incoming edges: GUARD_EDGE and the exit-edge
of LOOP. Set the two phi args in NEW_PHI for these edges: */
@@ -556,45 +565,54 @@ slpeel_update_phi_nodes_for_guard1 (edge guard_edge, struct loop *loop,
update_phi2 = new_phi;
- /** 2. Handle loop-closed-ssa-form phis **/
-
- /* 2.1. Generate new phi node in NEW_EXIT_BB: */
- new_phi = create_phi_node (SSA_NAME_VAR (PHI_RESULT (orig_phi)),
- *new_exit_bb);
-
- /* 2.2. NEW_EXIT_BB has one incoming edge: the exit-edge of the loop. */
- add_phi_arg (new_phi, loop_arg, loop->single_exit);
-
- /* 2.3. Update phi in successor of NEW_EXIT_BB: */
- gcc_assert (PHI_ARG_DEF_FROM_EDGE (update_phi2, new_exit_e) == loop_arg);
- SET_PHI_ARG_DEF (update_phi2, new_exit_e->dest_idx, PHI_RESULT (new_phi));
-
- /* 2.4. Record the newly created name with set_current_def.
- We want to find a name such that
- name = get_current_def (orig_loop_name)
- and to set its current definition as follows:
- set_current_def (name, new_phi_name)
-
- If LOOP is a new loop then loop_arg is already the name we're
- looking for. If LOOP is the original loop, then loop_arg is
- the orig_loop_name and the relevant name is recorded in its
- current reaching definition. */
- if (is_new_loop)
- current_new_name = loop_arg;
- else
- {
- current_new_name = get_current_def (loop_arg);
- /* current_def is not available only if the variable does not
- change inside the loop, in which case we also don't care
- about recording a current_def for it because we won't be
- trying to create loop-exit-phis for it. */
- if (!current_new_name)
- continue;
- }
- gcc_assert (get_current_def (current_new_name) == NULL_TREE);
+ if (is_gimple_reg (PHI_RESULT (orig_phi)))
+ {
+ /** 2. Handle loop-closed-ssa-form phis **/
+
+ /* 2.1. Generate new phi node in NEW_EXIT_BB: */
+ new_phi = create_phi_node (SSA_NAME_VAR (PHI_RESULT (orig_phi)),
+ *new_exit_bb);
+
+ /* 2.2. NEW_EXIT_BB has one incoming edge: the exit-edge of
+ the loop. */
+ add_phi_arg (new_phi, loop_arg, loop->single_exit);
+
+ /* 2.3. Update phi in successor of NEW_EXIT_BB: */
+ gcc_assert (PHI_ARG_DEF_FROM_EDGE (update_phi2, new_exit_e)
+ == loop_arg);
+ SET_PHI_ARG_DEF (update_phi2, new_exit_e->dest_idx,
+ PHI_RESULT (new_phi));
+
+ /* 2.4. Record the newly created name with set_current_def.
+ We want to find a name such that
+ name = get_current_def (orig_loop_name)
+ and to set its current definition as follows:
+ set_current_def (name, new_phi_name)
+
+ If LOOP is a new loop then loop_arg is already the name we're
+ looking for. If LOOP is the original loop, then loop_arg is
+ the orig_loop_name and the relevant name is recorded in its
+ current reaching definition. */
+
+ if (is_new_loop)
+ current_new_name = loop_arg;
+ else
+ {
+ current_new_name = get_current_def (loop_arg);
+
+ /* current_def is not available only if the variable
+ does not change inside the loop, in which case we
+ also don't care about recording a current_def for it
+ because we won't be trying to create loop-exit-phis
+ for it. */
+ if (!current_new_name)
+ continue;
+ }
+ gcc_assert (get_current_def (current_new_name) == NULL_TREE);
- set_current_def (current_new_name, PHI_RESULT (new_phi));
- bitmap_set_bit (*defs, SSA_NAME_VERSION (current_new_name));
+ set_current_def (current_new_name, PHI_RESULT (new_phi));
+ bitmap_set_bit (*defs, SSA_NAME_VERSION (current_new_name));
+ }
}
set_phi_nodes (new_merge_bb, phi_reverse (phi_nodes (new_merge_bb)));
@@ -2037,7 +2055,7 @@ vectorize_loops (struct loops *loops)
/* Allocate the bitmap that records which virtual variables that
need to be renamed. */
- vect_vnames_to_rename = BITMAP_ALLOC (NULL);
+ vect_memsyms_to_rename = BITMAP_ALLOC (NULL);
/* ----------- Analyze loops. ----------- */
@@ -2070,7 +2088,7 @@ vectorize_loops (struct loops *loops)
/* ----------- Finalize. ----------- */
- BITMAP_FREE (vect_vnames_to_rename);
+ BITMAP_FREE (vect_memsyms_to_rename);
for (i = 1; i < vect_loops_num; i++)
{
diff --git a/gcc/tree-vectorizer.h b/gcc/tree-vectorizer.h
index 3b7ca1b911a..2a8fece63e4 100644
--- a/gcc/tree-vectorizer.h
+++ b/gcc/tree-vectorizer.h
@@ -290,7 +290,7 @@ extern enum verbosity_levels vect_verbosity_level;
extern unsigned int vect_loops_num;
/* Bitmap of virtual variables to be renamed. */
-extern bitmap vect_vnames_to_rename;
+extern bitmap vect_memsyms_to_rename;
/*-----------------------------------------------------------------*/
/* Function prototypes. */
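
The vectorizer hunks above replace the bitmap of virtual SSA name versions with vect_memsyms_to_rename, a bitmap of DECL_UIDs of memory symbols, so the final flush becomes a single mark_set_for_renaming call. A small sketch of the record-then-flush pattern (the bitmap and helper names are hypothetical stand-ins; assumes GCC's internal headers such as tree.h, bitmap.h, tree-flow.h):

/* Hypothetical deferred-renaming queue keyed by DECL_UID, allocated
   elsewhere with BITMAP_ALLOC.  */
static bitmap pending_memsyms;

static void
defer_renaming_of (tree sym)
{
  /* Record memory symbols only; .MEM itself and GIMPLE registers are
     never queued, mirroring the guard used in the vectorizer above.  */
  if (sym != mem_var && !is_gimple_reg (sym))
    bitmap_set_bit (pending_memsyms, DECL_UID (sym));
}

static void
flush_deferred_renamings (void)
{
  /* Once the SSA updater can run safely, mark the whole set at once.  */
  mark_set_for_renaming (pending_memsyms);
  bitmap_clear (pending_memsyms);
}
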
diff --git a/gcc/tree.h b/gcc/tree.h
index fbafa1ac142..76efebd53a6 100644
--- a/gcc/tree.h
+++ b/gcc/tree.h
@@ -1771,7 +1771,7 @@ struct tree_phi_node GTY(())
int num_args;
int capacity;
- /* Basic block to that the phi node belongs. */
+ /* Basic block holding this PHI node. */
struct basic_block_def *bb;
/* Arguments of the PHI node. These are maintained in the same